Nov 25 15:58:39 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 25 15:58:39 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:58:39 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 
15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc 
restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:39 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:40 crc 
restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:58:40 crc restorecon[4690]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:58:40 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 25 15:58:41 crc kubenswrapper[4743]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.345128 4743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359151 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359208 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359221 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359231 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359241 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359252 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359264 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359275 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359288 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359303 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359315 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359325 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359333 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359341 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359350 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359359 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359368 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359377 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359385 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359394 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359402 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359410 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359418 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 15:58:41 crc 
kubenswrapper[4743]: W1125 15:58:41.359427 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359435 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359443 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359452 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359460 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359468 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359477 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359485 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359494 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359517 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359527 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359535 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359544 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359552 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359561 4743 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359569 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359578 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359587 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359629 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359643 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359653 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359665 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359677 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359688 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359699 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359710 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359720 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359733 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359744 4743 feature_gate.go:330] 
unrecognized feature gate: SigstoreImageVerification Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359756 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359767 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359780 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359791 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359801 4743 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359812 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359822 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359836 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359847 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359858 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359869 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359880 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359890 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359900 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359911 
4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359922 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359938 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359955 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.359967 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361084 4743 flags.go:64] FLAG: --address="0.0.0.0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361121 4743 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361143 4743 flags.go:64] FLAG: --anonymous-auth="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361156 4743 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361169 4743 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361180 4743 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361194 4743 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361207 4743 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361218 4743 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361229 4743 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 25 15:58:41 crc kubenswrapper[4743]: 
I1125 15:58:41.361241 4743 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361255 4743 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361265 4743 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361276 4743 flags.go:64] FLAG: --cgroup-root="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361286 4743 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361296 4743 flags.go:64] FLAG: --client-ca-file="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361305 4743 flags.go:64] FLAG: --cloud-config="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361315 4743 flags.go:64] FLAG: --cloud-provider="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361325 4743 flags.go:64] FLAG: --cluster-dns="[]" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361338 4743 flags.go:64] FLAG: --cluster-domain="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361348 4743 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361358 4743 flags.go:64] FLAG: --config-dir="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361368 4743 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361379 4743 flags.go:64] FLAG: --container-log-max-files="5" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361392 4743 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361404 4743 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361418 4743 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 25 15:58:41 crc 
kubenswrapper[4743]: I1125 15:58:41.361431 4743 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361445 4743 flags.go:64] FLAG: --contention-profiling="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361458 4743 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361470 4743 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361481 4743 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361491 4743 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361505 4743 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361515 4743 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361525 4743 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361535 4743 flags.go:64] FLAG: --enable-load-reader="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361545 4743 flags.go:64] FLAG: --enable-server="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361555 4743 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361570 4743 flags.go:64] FLAG: --event-burst="100" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361583 4743 flags.go:64] FLAG: --event-qps="50" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361626 4743 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361638 4743 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361649 4743 flags.go:64] FLAG: --eviction-hard="" Nov 25 15:58:41 crc 
kubenswrapper[4743]: I1125 15:58:41.361663 4743 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361673 4743 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361683 4743 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361713 4743 flags.go:64] FLAG: --eviction-soft="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361724 4743 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361735 4743 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361745 4743 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361755 4743 flags.go:64] FLAG: --experimental-mounter-path="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361765 4743 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361775 4743 flags.go:64] FLAG: --fail-swap-on="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361785 4743 flags.go:64] FLAG: --feature-gates="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361797 4743 flags.go:64] FLAG: --file-check-frequency="20s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361809 4743 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361819 4743 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361829 4743 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361839 4743 flags.go:64] FLAG: --healthz-port="10248" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361850 4743 flags.go:64] FLAG: --help="false" Nov 25 15:58:41 crc 
kubenswrapper[4743]: I1125 15:58:41.361860 4743 flags.go:64] FLAG: --hostname-override="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361869 4743 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361879 4743 flags.go:64] FLAG: --http-check-frequency="20s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361889 4743 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361899 4743 flags.go:64] FLAG: --image-credential-provider-config="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361909 4743 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361919 4743 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361929 4743 flags.go:64] FLAG: --image-service-endpoint="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361939 4743 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361949 4743 flags.go:64] FLAG: --kube-api-burst="100" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361959 4743 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361971 4743 flags.go:64] FLAG: --kube-api-qps="50" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361981 4743 flags.go:64] FLAG: --kube-reserved="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.361991 4743 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362001 4743 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362011 4743 flags.go:64] FLAG: --kubelet-cgroups="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362020 4743 flags.go:64] FLAG: --local-storage-capacity-isolation="true" 
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362032 4743 flags.go:64] FLAG: --lock-file="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362041 4743 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362051 4743 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362061 4743 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362076 4743 flags.go:64] FLAG: --log-json-split-stream="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362100 4743 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362111 4743 flags.go:64] FLAG: --log-text-split-stream="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362120 4743 flags.go:64] FLAG: --logging-format="text" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362130 4743 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362140 4743 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362150 4743 flags.go:64] FLAG: --manifest-url="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362160 4743 flags.go:64] FLAG: --manifest-url-header="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362173 4743 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362185 4743 flags.go:64] FLAG: --max-open-files="1000000" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362197 4743 flags.go:64] FLAG: --max-pods="110" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362207 4743 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362217 4743 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 
25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362227 4743 flags.go:64] FLAG: --memory-manager-policy="None" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362238 4743 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362248 4743 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362259 4743 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362270 4743 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362294 4743 flags.go:64] FLAG: --node-status-max-images="50" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362306 4743 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362316 4743 flags.go:64] FLAG: --oom-score-adj="-999" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362326 4743 flags.go:64] FLAG: --pod-cidr="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362336 4743 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362350 4743 flags.go:64] FLAG: --pod-manifest-path="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362360 4743 flags.go:64] FLAG: --pod-max-pids="-1" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362370 4743 flags.go:64] FLAG: --pods-per-core="0" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362380 4743 flags.go:64] FLAG: --port="10250" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362390 4743 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362401 4743 flags.go:64] 
FLAG: --provider-id="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362414 4743 flags.go:64] FLAG: --qos-reserved="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362426 4743 flags.go:64] FLAG: --read-only-port="10255" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362439 4743 flags.go:64] FLAG: --register-node="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362451 4743 flags.go:64] FLAG: --register-schedulable="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362463 4743 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362481 4743 flags.go:64] FLAG: --registry-burst="10" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362491 4743 flags.go:64] FLAG: --registry-qps="5" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362501 4743 flags.go:64] FLAG: --reserved-cpus="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362514 4743 flags.go:64] FLAG: --reserved-memory="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362527 4743 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362537 4743 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362547 4743 flags.go:64] FLAG: --rotate-certificates="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362558 4743 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362568 4743 flags.go:64] FLAG: --runonce="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362577 4743 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362587 4743 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362627 4743 flags.go:64] FLAG: 
--seccomp-default="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362639 4743 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362649 4743 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362659 4743 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362669 4743 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362679 4743 flags.go:64] FLAG: --storage-driver-password="root" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362690 4743 flags.go:64] FLAG: --storage-driver-secure="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362699 4743 flags.go:64] FLAG: --storage-driver-table="stats" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362709 4743 flags.go:64] FLAG: --storage-driver-user="root" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362718 4743 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362729 4743 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362739 4743 flags.go:64] FLAG: --system-cgroups="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362786 4743 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362802 4743 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362812 4743 flags.go:64] FLAG: --tls-cert-file="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362822 4743 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362835 4743 flags.go:64] FLAG: --tls-min-version="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362845 4743 
flags.go:64] FLAG: --tls-private-key-file="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362855 4743 flags.go:64] FLAG: --topology-manager-policy="none" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362864 4743 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362874 4743 flags.go:64] FLAG: --topology-manager-scope="container" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362885 4743 flags.go:64] FLAG: --v="2" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362898 4743 flags.go:64] FLAG: --version="false" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362913 4743 flags.go:64] FLAG: --vmodule="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362928 4743 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.362939 4743 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363194 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363211 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363225 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363241 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363255 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363268 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363281 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363292 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363303 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363314 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363325 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363339 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363353 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363367 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363379 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363390 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363402 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363412 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363424 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363435 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363446 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363457 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363467 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363477 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363487 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363496 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363506 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363515 
4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363524 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363532 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363541 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363550 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363558 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363566 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363574 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363582 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363624 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363633 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363644 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363653 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363662 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363670 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 25 
15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363679 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363688 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363696 4743 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363705 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363714 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363723 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363731 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363740 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363751 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363765 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363777 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363788 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363800 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363811 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363823 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363835 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363846 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363858 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363869 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363880 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363891 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363902 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363912 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363920 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363929 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363937 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363946 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363956 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.363970 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.364007 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.377739 4743 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.377792 4743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377939 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377961 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377972 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377982 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377991 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.377999 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378007 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378016 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378025 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378033 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378041 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378050 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378058 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378067 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378074 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378082 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378090 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378098 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378106 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378114 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378122 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378130 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378137 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378145 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378156 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378168 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378178 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378187 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378195 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378207 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378215 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378224 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378233 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378242 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378250 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378258 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378266 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378274 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378282 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378290 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378298 4743 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378306 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378314 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378322 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378331 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378339 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378346 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378355 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378362 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378371 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378381 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378392 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378400 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378408 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378417 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378425 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378433 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378441 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378449 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378457 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378465 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378473 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378480 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378491 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378499 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378508 4743 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378516 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378524 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378535 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378544 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378555 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.378570 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378890 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378905 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378913 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378923 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378931 4743 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378939 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378947 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378957 4743 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378965 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378973 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378980 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378989 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.378997 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379005 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379013 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379020 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379028 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379035 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379043 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379051 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379059 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379067 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379075 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379083 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379092 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379100 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379108 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379116 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379126 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379146 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379154 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379162 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379170 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379178 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379186 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379195 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379203 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379210 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379218 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379226 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379234 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379242 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379250 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379259 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379269 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379280 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379289 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379297 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379306 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379315 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379323 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379332 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379340 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379348 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379356 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379364 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379372 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379380 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379388 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379396 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379404 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379412 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379420 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379428 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379436 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379445 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379452 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379462 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379471 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379478 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.379488 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.379501 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.379799 4743 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.386838 4743 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.386977 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.388933 4743 server.go:997] "Starting client certificate rotation"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.388978 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.389201 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-15 10:25:19.896889863 +0000 UTC
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.389294 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 474h26m38.507598443s for next certificate rotation
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.437888 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.443672 4743 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.466802 4743 log.go:25] "Validated CRI v1 runtime API"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.567102 4743 log.go:25] "Validated CRI v1 image API"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.569213 4743 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.579971 4743 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-25-15-53-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.580080 4743 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.608366 4743 manager.go:217] Machine: {Timestamp:2025-11-25 15:58:41.605683362 +0000 UTC m=+0.727522981 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b6bba882-710f-4262-b836-cf27dad9acbb BootID:c508e6b4-2850-452f-b81a-6a39b638eedc Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:cf:15:7c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:cf:15:7c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:10:06:1e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b8:31:6a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3d:92:b8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e7:7f:7e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d6:0f:88:0c:d7:44 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2e:ae:62:d3:07:2b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.608878 4743 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.609126 4743 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.610723 4743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.611045 4743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.611127 4743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.611494 4743 topology_manager.go:138] "Creating topology manager with none policy"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.611517 4743 container_manager_linux.go:303] "Creating device plugin manager"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.612155 4743 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.612205 4743 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.612478 4743 state_mem.go:36] "Initialized new in-memory state store"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.612656 4743 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.623248 4743 kubelet.go:418] "Attempting to sync node with API server"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.623289 4743 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.623334 4743 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.623358 4743 kubelet.go:324] "Adding apiserver pod source"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.623377 4743 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.629657 4743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.631247 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.631336 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.631490 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.631558 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.631982 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.686872 4743 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 25 15:58:41
crc kubenswrapper[4743]: I1125 15:58:41.690730 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690779 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690795 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690812 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690835 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690851 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690866 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690888 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690908 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690924 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690949 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.690962 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.694731 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.696000 4743 server.go:1280] "Started kubelet"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.702488 4743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.702579 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.702563 4743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.703313 4743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 25 15:58:41 crc systemd[1]: Started Kubernetes Kubelet.
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.707983 4743 server.go:460] "Adding debug handlers to kubelet server"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.708399 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.708430 4743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.715496 4743 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.715519 4743 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.715692 4743 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.715740 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.715583 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:36:19.859115701 +0000 UTC
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.715841 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 778h37m38.143284435s for next certificate rotation
Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.717077 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.717220 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms"
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.717246 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.717853 4743 factory.go:55] Registering systemd factory
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.717912 4743 factory.go:221] Registration of the systemd container factory successfully
Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.716185 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b4b29bdcd7acc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:58:41.695783628 +0000 UTC m=+0.817623257,LastTimestamp:2025-11-25 15:58:41.695783628 +0000 UTC m=+0.817623257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.722981 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723022 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723034 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723044 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723053 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723065 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723074 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723083 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723094 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723104 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723114 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723123 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723133 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723143 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723177 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723187 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723198 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 
15:58:41.723208 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723218 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723228 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723239 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723247 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723257 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723267 4743 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723279 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723291 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723319 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723331 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723341 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723352 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723362 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723385 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723397 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723406 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723415 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723424 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723436 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723445 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723456 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723489 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723499 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723509 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723518 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723529 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723538 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723549 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723559 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723570 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723579 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723602 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723612 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723623 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723664 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723677 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 
15:58:41.723689 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723699 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723710 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723721 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723732 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723741 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723752 4743 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723761 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723771 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723779 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723789 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723798 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723807 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723816 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723826 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723835 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723844 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.726779 4743 factory.go:153] Registering CRI-O factory Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.726883 4743 factory.go:221] Registration of the crio container factory successfully Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.727026 4743 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory 
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.727107 4743 factory.go:103] Registering Raw factory
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.727175 4743 manager.go:1196] Started watching for new ooms in manager
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.727821 4743 manager.go:319] Starting recovery of all containers
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.723854 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728072 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728141 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728173 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728201 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728232 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728261 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728291 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728320 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728345 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728372 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728397 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728417 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728437 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728462 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728484 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728506 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728526 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728550 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728572 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728627 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728658 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728684 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728713 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728740 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728767 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728793 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728818 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728844 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728873 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728901 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728937 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.728965 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729008 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729046 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729083 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729116 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729147 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729177 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729205 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729235 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729270 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729298 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729323 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729348 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729373 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729399 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729427 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729453 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729483 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729512 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729537 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729564 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729629 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729661 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729691 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729732 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729764 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729793 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729820 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729850 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729887 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729916 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729946 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.729976 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730004 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730038 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730065 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730090 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730115 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730141 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730169 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730197 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730226 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730253 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730278 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730305 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730329 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730357 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730430 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730471 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730499 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730526 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730556 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730582 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730644 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730677 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730712 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730743 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.730774 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.734903 4743 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.734961 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.734981 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735000 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735020 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735039 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735055 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735073 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735086 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735098 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735109 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735123 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735135 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735146 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735159 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735169 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735180 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735190 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735201 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735213 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735225 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735238 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735249 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735261 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735273 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735286 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735300 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state"
pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735313 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735325 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735336 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735349 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735361 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735375 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735386 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735397 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735410 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735421 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735435 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735447 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" 
seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735458 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735477 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735488 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735500 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735512 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735523 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 25 15:58:41 crc 
kubenswrapper[4743]: I1125 15:58:41.735535 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735546 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735557 4743 reconstruct.go:97] "Volume reconstruction finished" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.735567 4743 reconciler.go:26] "Reconciler: start to sync state" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.765306 4743 manager.go:324] Recovery completed Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.771110 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.773559 4743 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.773630 4743 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.773654 4743 kubelet.go:2335] "Starting kubelet main sync loop" Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.773698 4743 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 25 15:58:41 crc kubenswrapper[4743]: W1125 15:58:41.774668 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.774720 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.783180 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.785718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.785774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.785785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.786582 4743 
cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.786613 4743 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.786639 4743 state_mem.go:36] "Initialized new in-memory state store" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.815570 4743 policy_none.go:49] "None policy: Start" Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.815878 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.816496 4743 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.816630 4743 state_mem.go:35] "Initializing new in-memory state store" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.868841 4743 manager.go:334] "Starting Device Plugin manager" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.868906 4743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.868924 4743 server.go:79] "Starting device plugin registration server" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.869376 4743 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.869404 4743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.869632 4743 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.869728 4743 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.869740 4743 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.873879 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.873937 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.875811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.875841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.875852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.875984 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.876177 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.876229 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.876781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.876818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.876869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.877152 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.877304 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.877339 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.878418 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 15:58:41 crc 
kubenswrapper[4743]: I1125 15:58:41.878427 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878517 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.878558 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879368 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879539 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879569 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.879991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880194 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.880839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.881367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.881417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.881427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.918150 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937709 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.937990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.938015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.938042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.938061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.938082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.938104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 
15:58:41.938153 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.970557 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.972133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.972177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.972192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:41 crc kubenswrapper[4743]: I1125 15:58:41.972227 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:58:41 crc kubenswrapper[4743]: E1125 15:58:41.972848 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039819 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.039903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040071 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040122 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 
15:58:42.040175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040441 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.040643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.174002 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.175969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.176039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.176066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.176120 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.176904 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.211346 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.218064 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.235234 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.247109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.254616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.256182 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a4c3519683a464f774a43f75d8b4d9eaef9fad30406ec78aca15577da697fcf6 WatchSource:0}: Error finding container a4c3519683a464f774a43f75d8b4d9eaef9fad30406ec78aca15577da697fcf6: Status 404 returned error can't find the container with id a4c3519683a464f774a43f75d8b4d9eaef9fad30406ec78aca15577da697fcf6 Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.256525 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-90546598d5311087258153f140cf325af0c8b9916d2e7625efe2e6c5b360b124 WatchSource:0}: Error finding container 90546598d5311087258153f140cf325af0c8b9916d2e7625efe2e6c5b360b124: Status 404 returned error can't find the container with id 90546598d5311087258153f140cf325af0c8b9916d2e7625efe2e6c5b360b124 Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.260237 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2981e4312e6406cb20acf1b93f40699d5ed72e44a62efa8cf71e65d637df3cc7 WatchSource:0}: Error finding container 2981e4312e6406cb20acf1b93f40699d5ed72e44a62efa8cf71e65d637df3cc7: Status 404 returned error can't find 
the container with id 2981e4312e6406cb20acf1b93f40699d5ed72e44a62efa8cf71e65d637df3cc7 Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.273057 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a4a6f9bd69285cd9d956cbe61e06ab0ed904839dbc5d551ad30065af36bd60bf WatchSource:0}: Error finding container a4a6f9bd69285cd9d956cbe61e06ab0ed904839dbc5d551ad30065af36bd60bf: Status 404 returned error can't find the container with id a4a6f9bd69285cd9d956cbe61e06ab0ed904839dbc5d551ad30065af36bd60bf Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.319953 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.577571 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.579096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.579143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.579153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.579372 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.580001 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: 
connect: connection refused" node="crc" Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.704152 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.779723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90546598d5311087258153f140cf325af0c8b9916d2e7625efe2e6c5b360b124"} Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.780924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a4a6f9bd69285cd9d956cbe61e06ab0ed904839dbc5d551ad30065af36bd60bf"} Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.782385 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5cf86deeafb3cd80f071d3d44830e5289d06a9eec34f85f116bd65740b64cd6b"} Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.785073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2981e4312e6406cb20acf1b93f40699d5ed72e44a62efa8cf71e65d637df3cc7"} Nov 25 15:58:42 crc kubenswrapper[4743]: I1125 15:58:42.788388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a4c3519683a464f774a43f75d8b4d9eaef9fad30406ec78aca15577da697fcf6"} Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 
15:58:42.803678 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.803823 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.942630 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.942727 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:58:42 crc kubenswrapper[4743]: W1125 15:58:42.944201 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:42 crc kubenswrapper[4743]: E1125 15:58:42.944242 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:58:43 crc kubenswrapper[4743]: E1125 15:58:43.121729 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="1.6s" Nov 25 15:58:43 crc kubenswrapper[4743]: W1125 15:58:43.376041 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:43 crc kubenswrapper[4743]: E1125 15:58:43.376170 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.380607 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.383116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.383160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.383172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.383200 
4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:58:43 crc kubenswrapper[4743]: E1125 15:58:43.383554 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.704452 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.793347 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1" exitCode=0 Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.793420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.793655 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.795225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.795255 4743 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0" exitCode=0 Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.795303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 
15:58:43.795314 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.795322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.795343 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.796052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.796076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.796085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.798242 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.798229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.798415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3"} Nov 25 15:58:43 crc 
kubenswrapper[4743]: I1125 15:58:43.798488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.798556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.799127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.799183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.799208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.801064 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2" exitCode=0 Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.801136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.801265 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.802844 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.802894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.802916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.803513 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe" exitCode=0 Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.803615 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe"} Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.803633 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.805061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.805084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.805094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.806809 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.807716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.807795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:58:43 crc kubenswrapper[4743]: I1125 15:58:43.807823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.704163 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused Nov 25 15:58:44 crc kubenswrapper[4743]: E1125 15:58:44.722961 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="3.2s" Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.809557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8"} Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.809629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079"} Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.809639 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99"} Nov 25 15:58:44 crc kubenswrapper[4743]: 
I1125 15:58:44.809648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.809656 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.809654 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.810556 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.810610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.810624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.812930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2cbb3a6ce8f0ce1dad29b21a3a2f0c057af3d6fe96c024fbb97b4eaa7df4b4a2"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.812975 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.813686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.813720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.813734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.814892 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802" exitCode=0
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.814961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.814983 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.815832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.815863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.815871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.817746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.817793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.817806 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.817828 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.817809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0"}
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.818828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.984608 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.985713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.985748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.985756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:44 crc kubenswrapper[4743]: I1125 15:58:44.985779 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 25 15:58:44 crc kubenswrapper[4743]: E1125 15:58:44.986255 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.166:6443: connect: connection refused" node="crc"
Nov 25 15:58:45 crc kubenswrapper[4743]: W1125 15:58:45.097811 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.166:6443: connect: connection refused
Nov 25 15:58:45 crc kubenswrapper[4743]: E1125 15:58:45.097897 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.166:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.384460 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.723904 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822117 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98" exitCode=0
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822254 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822287 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98"}
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822329 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822337 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822386 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822254 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.822348 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.823902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.824032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.824036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.824052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.824115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.961542 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:45 crc kubenswrapper[4743]: I1125 15:58:45.974132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.285436 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.708524 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.833957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610"}
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e"}
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834029 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965"}
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df"}
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c"}
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834058 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834065 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.834184 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:46 crc kubenswrapper[4743]: I1125 15:58:46.835949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.836777 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.836807 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.836781 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.838239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.839823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.839846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:47 crc kubenswrapper[4743]: I1125 15:58:47.839856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.009814 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.010168 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.012155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.012209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.012227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.145063 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.187289 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.189495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.189548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.189565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.189641 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.246022 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.385013 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.385122 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.840146 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.840146 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:48 crc kubenswrapper[4743]: I1125 15:58:48.841538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.271642 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.271844 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.273043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.273078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.273090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.488656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.488832 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.490148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.490181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:50 crc kubenswrapper[4743]: I1125 15:58:50.490193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:51 crc kubenswrapper[4743]: E1125 15:58:51.878574 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.320704 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.321129 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.326126 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.326185 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.729106 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.729240 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.733493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.733548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:58:55 crc kubenswrapper[4743]: I1125 15:58:55.733559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:58:56 crc kubenswrapper[4743]: I1125 15:58:56.305365 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]log ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]etcd ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-api-request-count-filter ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-startkubeinformers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-apiserver-admission-initializer ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/generic-apiserver-start-informers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-config-consumer ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-filter ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/storage-object-count-tracker-hook ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-apiextensions-informers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-apiextensions-controllers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/crd-informer-synced ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-system-namespaces-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-cluster-authentication-info-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-legacy-token-tracking-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-service-ip-repair-controllers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Nov 25 15:58:56 crc kubenswrapper[4743]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/priority-and-fairness-config-producer ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/bootstrap-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/start-kube-aggregator-informers ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-status-local-available-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-status-remote-available-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-registration-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-wait-for-first-sync ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-discovery-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/kube-apiserver-autoregistration ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]autoregister-completion ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-openapi-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: [+]poststarthook/apiservice-openapiv3-controller ok
Nov 25 15:58:56 crc kubenswrapper[4743]: livez check failed
Nov 25 15:58:56 crc kubenswrapper[4743]: I1125 15:58:56.305514 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:58:58 crc kubenswrapper[4743]: I1125 15:58:58.386234 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 25 15:58:58 crc kubenswrapper[4743]: I1125 15:58:58.386298 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.298905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.299112 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.300173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.300205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.300213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.315100 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.315242 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.316389 4743 trace.go:236] Trace[227022814]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:58:46.102) (total time: 14213ms):
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[227022814]: ---"Objects listed" error: 14213ms (15:59:00.316)
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[227022814]: [14.21362494s] [14.21362494s] END
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.316409 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.319549 4743 trace.go:236] Trace[1766269367]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:58:45.601) (total time: 14717ms):
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[1766269367]: ---"Objects listed" error: 14717ms (15:59:00.319)
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[1766269367]: [14.717706759s] [14.717706759s] END
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.319583 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.319826 4743 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.319869 4743 trace.go:236] Trace[668056161]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:58:45.748) (total time: 14571ms):
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[668056161]: ---"Objects listed" error: 14571ms (15:59:00.319)
Nov 25 15:59:00 crc kubenswrapper[4743]: Trace[668056161]: [14.571688182s] [14.571688182s] END
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.319917 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.321490 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.323125 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.352837 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45984->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.352863 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45980->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.352932 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45984->192.168.126.11:17697: read: connection reset by peer"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.352951 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45980->192.168.126.11:17697: read: connection reset by peer"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.633266 4743 apiserver.go:52] "Watching apiserver"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.637469 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.637885 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.638346 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.638423 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.638499 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.639023 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.639166 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.639235 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.639409 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.639458 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.639684 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.640882 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.640943 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.641399 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.641461 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.642053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.644019 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.645521 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.645633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.645633 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.671678 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.684914 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.698144 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.708253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.716776 4743 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.719867 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721736 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.721818 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.722185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.722917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723042 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723072 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.722280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.723219 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:59:01.223189805 +0000 UTC m=+20.345029354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723366 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723461 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723644 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.723958 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724443 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724795 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724762 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.724894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725008 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725091 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725165 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725464 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725444 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725405 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725630 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725664 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725699 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725772 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725849 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725857 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.725981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726233 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726187 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726388 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726446 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726574 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.726442 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.727095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.727106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.727124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728636 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728877 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.728999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729138 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729170 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729227 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729227 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729277 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729603 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730503 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.729742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730764 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730793 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730800 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730819 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730847 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730857 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod 
"1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730874 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.730971 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 
15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731096 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731118 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731208 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 
15:59:00.731315 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731329 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731510 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731708 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731780 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 
15:59:00.731832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731857 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731881 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731906 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731932 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 
15:59:00.732111 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732134 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732156 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732179 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732365 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732392 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732493 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732518 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732544 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:59:00 crc kubenswrapper[4743]: 
I1125 15:59:00.732696 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732821 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731661 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732963 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732990 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 
15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733362 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733390 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733416 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733492 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733518 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733544 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733578 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733666 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") 
pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733796 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733823 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733849 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733874 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733929 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733980 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734008 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734060 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734199 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734229 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:59:00 crc kubenswrapper[4743]: 
I1125 15:59:00.734283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734310 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734756 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734856 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735062 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735081 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735097 4743 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735112 4743 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735127 4743 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735143 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735158 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc 
kubenswrapper[4743]: I1125 15:59:00.735173 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735188 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735203 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735217 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735231 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735244 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735258 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735272 4743 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735286 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735301 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735315 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735331 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735346 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735359 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735374 4743 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735390 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735403 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735418 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735432 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735446 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735460 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735474 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735488 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735501 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735515 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735528 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735541 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735555 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735569 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735582 4743 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735615 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735629 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735643 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735660 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735673 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735687 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735702 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735718 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735732 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735746 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735759 4743 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735771 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735787 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735800 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735815 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735828 4743 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735842 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735855 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735869 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735883 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735898 4743 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735914 4743 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735927 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735941 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735954 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735967 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735980 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735993 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.736403 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.741584 4743 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.746663 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.731903 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732247 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732876 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.732885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733348 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.733940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734766 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734783 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.734964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735319 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.735408 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.736345 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.736655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.736896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737286 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737680 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.737923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.738217 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.738392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.749431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.738453 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.738861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.738978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.739225 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.739376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.739551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.739889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740357 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740415 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.741142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.740921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.741255 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.741949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.742443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.742748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.742827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.742853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743235 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.743865 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744318 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744370 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744476 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744564 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744823 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.744846 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.745358 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745444 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745629 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745755 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.745962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.746088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.746327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.746348 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.746481 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.747328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.747785 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.748030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.748432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.748499 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.749182 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.749788 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.749880 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.749923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750118 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750372 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750530 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.750980 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.751287 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.751428 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.751532 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.751751 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.751967 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.752101 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:01.252084072 +0000 UTC m=+20.373923621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.752243 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:59:01.252217997 +0000 UTC m=+20.374057546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.752291 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.752431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.753066 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.753087 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.753101 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.753169 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:01.253157179 +0000 UTC m=+20.374996828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.753705 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.753831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.754476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.754684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.754899 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.754938 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755030 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755639 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.755797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.756663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.756786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.756916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.757037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.757139 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.757491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.761382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.762138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.764099 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.764133 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.764146 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:00 crc kubenswrapper[4743]: E1125 15:59:00.764197 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:01.264177002 +0000 UTC m=+20.386016661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.765632 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.767559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.771297 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.773901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.784914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.786346 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.787699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.836920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.836999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837049 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837061 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837072 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837080 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837090 4743 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837100 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837109 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837117 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837124 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837132 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837140 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837147 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837156 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837164 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837171 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837184 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837193 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837201 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837209 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837217 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837225 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837233 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837241 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837251 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837258 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837265 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837276 4743 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837284 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837292 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837300 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837308 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837317 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837326 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837334 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837412 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837527 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837637 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837657 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837671 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837670 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837684 4743 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837779 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837792 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837803 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837813 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837823 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837832 4743 reconciler_common.go:293] "Volume detached for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837842 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837850 4743 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837860 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837869 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837880 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837891 4743 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837899 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" 
(UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837908 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837916 4743 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837924 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837932 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837940 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837948 4743 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837957 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837965 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837973 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837981 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.837992 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838003 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838011 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838018 4743 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on 
node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838028 4743 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838036 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838044 4743 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838053 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838070 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838085 4743 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838094 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838103 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838111 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838119 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838127 4743 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838135 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838143 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838153 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" 
Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838161 4743 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838170 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838178 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838186 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838194 4743 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838203 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838210 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838218 4743 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838226 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838234 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838242 4743 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838251 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838259 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838267 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838276 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838284 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838292 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838301 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838310 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838318 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838326 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838335 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838346 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838354 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838362 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838370 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838378 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838387 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838395 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc 
kubenswrapper[4743]: I1125 15:59:00.838403 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838412 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838421 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838430 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838438 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838447 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838455 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838463 4743 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838472 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838480 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838488 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838496 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838505 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838513 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838521 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838529 4743 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838538 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838547 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.838555 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.876153 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.881137 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8" exitCode=255 Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.881217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8"} Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.895055 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.895362 4743 scope.go:117] "RemoveContainer" containerID="84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.895762 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.900746 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.909733 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.924856 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.935643 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.950424 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.953231 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.959471 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.965805 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:00 crc kubenswrapper[4743]: W1125 15:59:00.966112 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0297e06c3c2254835407dd870824c20d2c809f248136201023c6c2e3737b76f5 WatchSource:0}: Error finding container 0297e06c3c2254835407dd870824c20d2c809f248136201023c6c2e3737b76f5: Status 404 returned error can't find the container with id 0297e06c3c2254835407dd870824c20d2c809f248136201023c6c2e3737b76f5 Nov 25 15:59:00 crc kubenswrapper[4743]: I1125 15:59:00.968953 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:59:00 crc kubenswrapper[4743]: W1125 15:59:00.969723 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cb3675e220c219f1f7d8297abc622cc29697d5b2a474b53d08653ec66806e532 WatchSource:0}: Error finding container cb3675e220c219f1f7d8297abc622cc29697d5b2a474b53d08653ec66806e532: Status 404 returned error can't find the container with id cb3675e220c219f1f7d8297abc622cc29697d5b2a474b53d08653ec66806e532 Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.241903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.242178 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:59:02.242157642 +0000 UTC m=+21.363997191 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.291167 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.311103 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.322091 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.337083 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.343356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.343431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.343458 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.343488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.343612 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.343666 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:02.343648456 +0000 UTC m=+21.465488005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.343992 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344035 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:02.34402429 +0000 UTC m=+21.465863839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344106 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344127 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344140 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344169 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:02.344159844 +0000 UTC m=+21.465999393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344224 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344236 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344244 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.344273 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:02.344263338 +0000 UTC m=+21.466102887 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.352098 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.371934 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.385103 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.396241 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.407541 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.774322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.774444 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.774455 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:01 crc kubenswrapper[4743]: E1125 15:59:01.774657 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.782944 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.783475 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.784781 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.785371 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.786534 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.787030 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.787582 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.788497 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.789151 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.790040 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.790506 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.791626 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.792083 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.792614 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.794237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.794827 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.795726 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.796112 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.796761 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.796779 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.797893 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.798319 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.799239 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.799724 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.800744 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.801192 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.801773 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.802772 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.803217 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.804106 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.804553 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.805368 4743 kubelet_volumes.go:152] "Cleaned up orphaned 
volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.805466 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.807109 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.807933 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.808313 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.809931 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.810564 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.810645 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.811393 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.812011 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.813060 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.813497 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 
15:59:01.814547 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.815204 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.816203 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.816712 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.817581 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.818140 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.820099 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.821287 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 
15:59:01.822143 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.822867 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.823371 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.823702 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.824613 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.825383 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.841224 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7
462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae01
0dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.855993 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.870819 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.884278 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.884368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0297e06c3c2254835407dd870824c20d2c809f248136201023c6c2e3737b76f5"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.887193 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.889329 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.889575 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.890481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7a6decec36724cb8f4ca1d9ed40b57233fea04c0f0b412e27cdf9cbb01bf91cb"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.892019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.892071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3"} Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.892102 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb3675e220c219f1f7d8297abc622cc29697d5b2a474b53d08653ec66806e532"} Nov 25 15:59:01 crc 
kubenswrapper[4743]: I1125 15:59:01.894051 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.896791 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.920310 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.946714 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.961896 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.977106 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:01 crc kubenswrapper[4743]: I1125 15:59:01.992503 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:01Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.008811 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9
a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:02Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.022655 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:02Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.038761 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:02Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.052583 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:02Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.252171 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 
15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.252467 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:59:04.252416021 +0000 UTC m=+23.374255620 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.354082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.354168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.354220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354261 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.354268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354362 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:04.354337131 +0000 UTC m=+23.476176770 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354442 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354449 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354505 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354468 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354539 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354542 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:04.354512917 +0000 UTC m=+23.476352496 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354557 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354519 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354622 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:04.35460623 +0000 UTC m=+23.476446009 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.354663 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:04.354646222 +0000 UTC m=+23.476485991 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:02 crc kubenswrapper[4743]: I1125 15:59:02.773888 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:02 crc kubenswrapper[4743]: E1125 15:59:02.774036 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.774835 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.774928 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:03 crc kubenswrapper[4743]: E1125 15:59:03.775018 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:03 crc kubenswrapper[4743]: E1125 15:59:03.775070 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.898030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5"} Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.915673 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.938905 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.954274 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.967349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.982120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" 
enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:03 crc kubenswrapper[4743]: I1125 15:59:03.995805 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:03Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.008437 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.020956 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.271456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.271682 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:59:08.271655526 +0000 UTC m=+27.393495075 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.371927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.371983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.372022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.372053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372098 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372144 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372165 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372190 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:08.372172177 +0000 UTC m=+27.494011726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372190 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372195 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372236 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372250 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372211 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:59:08.372202078 +0000 UTC m=+27.494041737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372315 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:08.372306342 +0000 UTC m=+27.494145891 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372214 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.372385 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-25 15:59:08.372365714 +0000 UTC m=+27.494205343 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.655970 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6xggw"] Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.656307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.658234 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.659290 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.659422 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.671294 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.688924 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.702693 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.720382 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9
a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.774882 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.774952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78bac9e6-44dd-4270-9183-774823a568a8-hosts-file\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.774987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htr6r\" (UniqueName: \"kubernetes.io/projected/78bac9e6-44dd-4270-9183-774823a568a8-kube-api-access-htr6r\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: E1125 15:59:04.775023 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.778060 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.811238 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.844792 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.865678 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.875644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htr6r\" (UniqueName: 
\"kubernetes.io/projected/78bac9e6-44dd-4270-9183-774823a568a8-kube-api-access-htr6r\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.875707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78bac9e6-44dd-4270-9183-774823a568a8-hosts-file\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.875783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78bac9e6-44dd-4270-9183-774823a568a8-hosts-file\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.891234 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.900322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htr6r\" (UniqueName: \"kubernetes.io/projected/78bac9e6-44dd-4270-9183-774823a568a8-kube-api-access-htr6r\") pod \"node-resolver-6xggw\" (UID: \"78bac9e6-44dd-4270-9183-774823a568a8\") " pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:04 crc kubenswrapper[4743]: I1125 15:59:04.967182 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6xggw" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.071473 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n2r2l"] Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.072125 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.074564 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.075059 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.075095 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.076403 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.076464 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.092477 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.107964 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.121090 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.135208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.145923 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.158986 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-cnibin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-socket-dir-parent\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-netns\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184346 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-system-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-cni-binary-copy\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-k8s-cni-cncf-io\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-bin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-conf-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184463 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-os-release\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-daemon-config\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858zh\" (UniqueName: \"kubernetes.io/projected/2175b34c-5202-4e94-af0e-2f879b98c0bc-kube-api-access-858zh\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-kubelet\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-etc-kubernetes\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184579 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184617 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-multus\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-hostroot\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.184659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-multus-certs\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.185929 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.199322 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.214735 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.228062 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-kubelet\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-etc-kubernetes\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-multus\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-hostroot\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-multus-certs\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " 
pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-cnibin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-socket-dir-parent\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-netns\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-system-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-kubelet\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.285935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-cni-binary-copy\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-k8s-cni-cncf-io\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-os-release\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-bin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-conf-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-daemon-config\") pod 
\"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286214 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-858zh\" (UniqueName: \"kubernetes.io/projected/2175b34c-5202-4e94-af0e-2f879b98c0bc-kube-api-access-858zh\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-bin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-k8s-cni-cncf-io\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-conf-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-multus-certs\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 
15:59:05.286636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-etc-kubernetes\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-hostroot\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-cnibin\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-run-netns\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286790 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-os-release\") pod \"multus-n2r2l\" (UID: 
\"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-socket-dir-parent\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-host-var-lib-cni-multus\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.286836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2175b34c-5202-4e94-af0e-2f879b98c0bc-system-cni-dir\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.287159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-multus-daemon-config\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.287218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2175b34c-5202-4e94-af0e-2f879b98c0bc-cni-binary-copy\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.306532 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858zh\" (UniqueName: \"kubernetes.io/projected/2175b34c-5202-4e94-af0e-2f879b98c0bc-kube-api-access-858zh\") pod \"multus-n2r2l\" (UID: \"2175b34c-5202-4e94-af0e-2f879b98c0bc\") " pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.383513 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n2r2l" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.392800 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:59:05 crc kubenswrapper[4743]: W1125 15:59:05.398151 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2175b34c_5202_4e94_af0e_2f879b98c0bc.slice/crio-c9d1ba58d3b55e832e6b366c396edc1c22a2869fa2a6e90d01e2717b69a09589 WatchSource:0}: Error finding container c9d1ba58d3b55e832e6b366c396edc1c22a2869fa2a6e90d01e2717b69a09589: Status 404 returned error can't find the container with id c9d1ba58d3b55e832e6b366c396edc1c22a2869fa2a6e90d01e2717b69a09589 Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.406459 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.409833 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.423073 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.455787 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.484430 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ntxtl"] Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.485325 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.486500 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.490042 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.490454 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.496807 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f7q7f"] Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.497263 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbbjc"] Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.498142 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.498680 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.502581 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.502820 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.503056 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.506730 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.506818 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.506840 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.506872 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.506730 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.507226 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 
15:59:05.507696 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.508707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.511668 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.534121 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.565636 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.579005 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-cnibin\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589706 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-os-release\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589760 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589776 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c29847-f70f-4ab1-9691-685966384446-proxy-tls\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.589828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc 
kubenswrapper[4743]: I1125 15:59:05.590001 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73c29847-f70f-4ab1-9691-685966384446-mcd-auth-proxy-config\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwtx\" (UniqueName: \"kubernetes.io/projected/c2d6248c-be7e-48f3-b314-6089c361b67a-kube-api-access-cqwtx\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590136 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet\") pod \"ovnkube-node-pbbjc\" (UID: 
\"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/73c29847-f70f-4ab1-9691-685966384446-rootfs\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590247 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb4p\" (UniqueName: \"kubernetes.io/projected/73c29847-f70f-4ab1-9691-685966384446-kube-api-access-pzb4p\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590333 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-system-cni-dir\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbps\" (UniqueName: \"kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590442 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.590927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.591319 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.604084 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.617070 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.631871 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.644321 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.655346 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.667614 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.687950 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" 
(UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-cnibin\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c29847-f70f-4ab1-9691-685966384446-proxy-tls\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-os-release\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn\") pod \"ovnkube-node-pbbjc\" (UID: 
\"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73c29847-f70f-4ab1-9691-685966384446-mcd-auth-proxy-config\") pod \"machine-config-daemon-f7q7f\" (UID: 
\"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwtx\" (UniqueName: \"kubernetes.io/projected/c2d6248c-be7e-48f3-b314-6089c361b67a-kube-api-access-cqwtx\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693573 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/73c29847-f70f-4ab1-9691-685966384446-rootfs\") pod \"machine-config-daemon-f7q7f\" (UID: 
\"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693683 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb4p\" (UniqueName: \"kubernetes.io/projected/73c29847-f70f-4ab1-9691-685966384446-kube-api-access-pzb4p\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-system-cni-dir\") pod 
\"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.693889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbbps\" (UniqueName: \"kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694461 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: 
\"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc 
kubenswrapper[4743]: I1125 15:59:05.694776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-cnibin\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.694817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-os-release\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 
15:59:05.695748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695840 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/73c29847-f70f-4ab1-9691-685966384446-rootfs\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.695909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696066 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-system-cni-dir\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c2d6248c-be7e-48f3-b314-6089c361b67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/73c29847-f70f-4ab1-9691-685966384446-mcd-auth-proxy-config\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.696843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c2d6248c-be7e-48f3-b314-6089c361b67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.697125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.700670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73c29847-f70f-4ab1-9691-685966384446-proxy-tls\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.702279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.711749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwtx\" (UniqueName: \"kubernetes.io/projected/c2d6248c-be7e-48f3-b314-6089c361b67a-kube-api-access-cqwtx\") pod \"multus-additional-cni-plugins-ntxtl\" (UID: \"c2d6248c-be7e-48f3-b314-6089c361b67a\") " pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.713445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbbps\" (UniqueName: \"kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps\") pod \"ovnkube-node-pbbjc\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.715786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb4p\" (UniqueName: \"kubernetes.io/projected/73c29847-f70f-4ab1-9691-685966384446-kube-api-access-pzb4p\") pod \"machine-config-daemon-f7q7f\" (UID: \"73c29847-f70f-4ab1-9691-685966384446\") " pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.719529 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.733543 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.747485 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.769821 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.774472 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.774529 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:05 crc kubenswrapper[4743]: E1125 15:59:05.774623 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:05 crc kubenswrapper[4743]: E1125 15:59:05.774721 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.782170 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.794958 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.796926 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.809270 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: W1125 15:59:05.809486 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d6248c_be7e_48f3_b314_6089c361b67a.slice/crio-8b251693bd2e0b5d72f12e711c2282ca2a6aa5aa2df80e330a57dd574423ef42 WatchSource:0}: Error finding container 8b251693bd2e0b5d72f12e711c2282ca2a6aa5aa2df80e330a57dd574423ef42: Status 404 returned error can't find the container with id 8b251693bd2e0b5d72f12e711c2282ca2a6aa5aa2df80e330a57dd574423ef42 Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.811246 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.819778 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.823542 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: W1125 15:59:05.829456 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04400c3_4f05_4be2_b759_a60cec0746ec.slice/crio-7705045cc01c8d9ca1344b80a19bf76e3c1931e5a42002654041baa614c7e38a WatchSource:0}: Error finding container 7705045cc01c8d9ca1344b80a19bf76e3c1931e5a42002654041baa614c7e38a: Status 404 returned error can't find the container with id 7705045cc01c8d9ca1344b80a19bf76e3c1931e5a42002654041baa614c7e38a Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.836465 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: W1125 15:59:05.841791 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c29847_f70f_4ab1_9691_685966384446.slice/crio-ec85dccf224b77d82d48550fccb933e76080f30d7f7c97e4fba293a118477d1b WatchSource:0}: Error finding container ec85dccf224b77d82d48550fccb933e76080f30d7f7c97e4fba293a118477d1b: Status 404 returned error can't find the container with id ec85dccf224b77d82d48550fccb933e76080f30d7f7c97e4fba293a118477d1b Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.851253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.907480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"ec85dccf224b77d82d48550fccb933e76080f30d7f7c97e4fba293a118477d1b"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.908975 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"7705045cc01c8d9ca1344b80a19bf76e3c1931e5a42002654041baa614c7e38a"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.910746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerStarted","Data":"1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.910797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerStarted","Data":"c9d1ba58d3b55e832e6b366c396edc1c22a2869fa2a6e90d01e2717b69a09589"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.913023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerStarted","Data":"8b251693bd2e0b5d72f12e711c2282ca2a6aa5aa2df80e330a57dd574423ef42"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.916059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xggw" event={"ID":"78bac9e6-44dd-4270-9183-774823a568a8","Type":"ContainerStarted","Data":"c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.916132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6xggw" event={"ID":"78bac9e6-44dd-4270-9183-774823a568a8","Type":"ContainerStarted","Data":"d66be8fd07b90836c444a6e449c7eda9c33f8a4594a5d58298e9088ec40a7885"} Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.929275 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.942930 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.962767 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:05 crc kubenswrapper[4743]: I1125 15:59:05.979128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:05Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.004753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.020550 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.035873 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.056865 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.075911 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.095436 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.112026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.126283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.138933 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.152839 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.166349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.178136 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.195240 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.212060 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.227442 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.240292 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.255015 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.266453 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.276837 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.290503 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.308531 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.327900 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.340138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.352951 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.722441 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.723873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.723926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.723940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.724037 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.733923 4743 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.734259 4743 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.735469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.735516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.735528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.735553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.735564 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.751528 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.755073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.755102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.755111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.755271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.755373 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.767894 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.771680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.771709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.771717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.771732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.771742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.773994 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.774085 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.786758 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.791217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.791256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.791269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.791286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.791298 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.804433 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.809084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.809114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.809125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.809144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.809155 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.824100 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: E1125 15:59:06.824281 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.826452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.826483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.826494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.826510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.826519 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.920816 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314" exitCode=0 Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.920906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.923660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.923707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.925540 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee" exitCode=0 Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.925633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.928009 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.928062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.928071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.928090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.928104 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:06Z","lastTransitionTime":"2025-11-25T15:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.947671 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.961464 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.978487 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:06 crc kubenswrapper[4743]: I1125 15:59:06.996814 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:06Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.015381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.029566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.031929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.031974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.031984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.032003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.032012 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.045788 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.057665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.071452 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.085208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.096976 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.108884 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.125208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.135271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.135334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.135347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.135370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.135385 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.140055 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.152280 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.166370 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.185927 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.200098 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.213475 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.227507 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.238732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.238783 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.238799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.238819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.238831 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.249362 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.269448 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.284888 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.308508 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.337460 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.341874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.341923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.341935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.341953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.341962 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.364734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.387408 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9
a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.401391 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.445548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.445656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.445670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.445696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.445713 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.548642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.548682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.548692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.548706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.548716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.652178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.652555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.652566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.652582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.652645 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.718730 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zxxwm"] Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.719214 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.721131 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.724420 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.724497 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.724814 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.736363 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.751613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.755814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.755877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.755889 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.755908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.755921 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.764584 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.774506 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.774522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:07 crc kubenswrapper[4743]: E1125 15:59:07.774708 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:07 crc kubenswrapper[4743]: E1125 15:59:07.774789 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.780977 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.795429 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a2
08722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.808163 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.815438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5jz\" (UniqueName: \"kubernetes.io/projected/e69c3c02-668d-42ba-9347-e5bea6cdf260-kube-api-access-5x5jz\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.815496 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e69c3c02-668d-42ba-9347-e5bea6cdf260-host\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.815548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e69c3c02-668d-42ba-9347-e5bea6cdf260-serviceca\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.822210 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.837383 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.849105 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.859145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.859184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.859192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.859207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.859217 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.863571 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.883854 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.908207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.916365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e69c3c02-668d-42ba-9347-e5bea6cdf260-host\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.916404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5jz\" (UniqueName: \"kubernetes.io/projected/e69c3c02-668d-42ba-9347-e5bea6cdf260-kube-api-access-5x5jz\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.916434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e69c3c02-668d-42ba-9347-e5bea6cdf260-serviceca\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.916469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e69c3c02-668d-42ba-9347-e5bea6cdf260-host\") pod \"node-ca-zxxwm\" (UID: 
\"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.918097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e69c3c02-668d-42ba-9347-e5bea6cdf260-serviceca\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.921695 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.930363 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d" exitCode=0 Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.930420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.934204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.934235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.934248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.934258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.945636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5jz\" (UniqueName: \"kubernetes.io/projected/e69c3c02-668d-42ba-9347-e5bea6cdf260-kube-api-access-5x5jz\") pod \"node-ca-zxxwm\" (UID: \"e69c3c02-668d-42ba-9347-e5bea6cdf260\") " pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.954793 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.962131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.962181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.962193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.962218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.962230 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:07Z","lastTransitionTime":"2025-11-25T15:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.980360 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:07 crc kubenswrapper[4743]: I1125 15:59:07.996351 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:07Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.010706 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.046863 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.064618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.064657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.064669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.064686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.064698 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.067010 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zxxwm" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.089433 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPa
th\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.131425 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.167944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.167982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.168012 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.168047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.168060 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.172479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.207695 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.248086 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.271072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.271125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.271139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.271158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.271170 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.293188 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.321982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.322138 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:59:16.322115752 +0000 UTC m=+35.443955301 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.325459 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.368314 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.374806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.374856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.374869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 
15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.374889 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.374903 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.409128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.422784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.422834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.422868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.422903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.422981 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423030 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:16.423014796 +0000 UTC m=+35.544854345 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423113 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423128 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423140 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423171 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:16.423162391 +0000 UTC m=+35.545001940 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423212 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423233 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:16.423226774 +0000 UTC m=+35.545066323 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423270 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423278 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423285 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.423304 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:16.423298076 +0000 UTC m=+35.545137625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.455923 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.477908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.477949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.477960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.477980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.477990 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.487252 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.527800 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.580388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.580431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.580440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.580455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.580467 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.682886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.682934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.682943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.682958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.682967 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.774605 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:08 crc kubenswrapper[4743]: E1125 15:59:08.774762 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.785020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.785054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.785066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.785082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.785094 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.887539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.887578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.887602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.887618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.887629 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.940327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.940376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.942965 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1" exitCode=0 Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.943028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.944542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zxxwm" event={"ID":"e69c3c02-668d-42ba-9347-e5bea6cdf260","Type":"ContainerStarted","Data":"ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.944570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zxxwm" event={"ID":"e69c3c02-668d-42ba-9347-e5bea6cdf260","Type":"ContainerStarted","Data":"22c458bae4bc46ef3458460836c9e58bd504cacd326621818d98e6ae8c872e27"} Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.968570 4743 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080e
d7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b625
72b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.986709 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.990463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.990500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.990512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 
15:59:08.990528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:08 crc kubenswrapper[4743]: I1125 15:59:08.990540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:08Z","lastTransitionTime":"2025-11-25T15:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.001247 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.021058 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.033146 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.048836 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9
a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.064697 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.077672 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.089699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.093534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.093569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.093586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.093624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.093637 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.101764 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.112872 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.124126 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.133119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.143871 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.159520 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 
15:59:09.179011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.195965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.196002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.196011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.196026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.196041 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.206843 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.247838 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.292519 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.298049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.298084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.298101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.298122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.298135 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.327364 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.371066 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.399916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.399948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.399957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.399970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.399978 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.409288 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.448431 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.488707 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.502406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 
15:59:09.502451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.502464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.502479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.502489 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.526697 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.567267 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.604915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.604960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.604970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.604990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.605002 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.607960 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.644773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.691249 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.706921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.706963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.706974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.706989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.707000 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.728471 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 
15:59:09.774229 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.774284 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:09 crc kubenswrapper[4743]: E1125 15:59:09.774454 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:09 crc kubenswrapper[4743]: E1125 15:59:09.774643 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.809701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.809741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.809752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.809768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.809778 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.912449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.912481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.912490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.912503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.912512 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:09Z","lastTransitionTime":"2025-11-25T15:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.949405 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a" exitCode=0 Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.949454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a"} Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.962243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.978413 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:09 crc kubenswrapper[4743]: I1125 15:59:09.993669 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.004561 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.014554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.014619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.014636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.014655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.014667 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.017827 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.039962 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.060934 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.074721 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.086654 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.117040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.117083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.117094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.117111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.117121 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.125249 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.166115 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.204358 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.220017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.220083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.220094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.220111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.220126 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.247331 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.288503 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.323089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.323142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.323153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.323171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.323183 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.328463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.425894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.425940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.425961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.425983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.425998 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.528871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.528909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.528918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.528932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.528941 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.631324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.631352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.631360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.631372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.631380 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.733619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.733662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.733673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.733699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.733712 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.774151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:10 crc kubenswrapper[4743]: E1125 15:59:10.774307 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.836397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.836432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.836466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.836485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.836497 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.939016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.939048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.939056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.939069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.939078 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:10Z","lastTransitionTime":"2025-11-25T15:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.961453 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b" exitCode=0 Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.961546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.966195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"} Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.976127 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:10 crc kubenswrapper[4743]: I1125 15:59:10.990975 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.007821 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.024310 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.042479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.042557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.042637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.042678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.042701 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.044243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.064220 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.084959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.106316 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.118292 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.132244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.148282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.148326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.148336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc 
kubenswrapper[4743]: I1125 15:59:11.148355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.148369 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.163531 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.180302 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a2
08722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.192903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.205861 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.221890 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.249886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.249914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.249922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.249935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.249942 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.352275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.352306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.352314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.352327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.352336 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.454598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.454634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.454642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.454657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.454668 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.557786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.557862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.557885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.557914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.557936 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.661405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.661457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.661481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.661510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.661527 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.763725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.763778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.763795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.763817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.763836 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.774339 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.774415 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:11 crc kubenswrapper[4743]: E1125 15:59:11.774478 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:11 crc kubenswrapper[4743]: E1125 15:59:11.774555 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.794014 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.810189 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.830084 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.844805 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.863016 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.867789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.867830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.867841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.867859 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.867876 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.884554 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.901243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.914708 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.935854 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.948567 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.960941 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.970494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.970529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.970541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.970559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.970571 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:11Z","lastTransitionTime":"2025-11-25T15:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.973771 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.974156 4743 generic.go:334] "Generic (PLEG): container finished" podID="c2d6248c-be7e-48f3-b314-6089c361b67a" containerID="789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9" exitCode=0 Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.974240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerDied","Data":"789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9"} Nov 25 15:59:11 crc kubenswrapper[4743]: I1125 15:59:11.987001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.001573 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.013457 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.026917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.042849 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9
a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.057743 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.069427 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.074071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.074113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.074125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.074142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.074153 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.084472 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.098064 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.111823 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.126244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.141935 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.156217 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.170991 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.175912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.175958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.175971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.175993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.176004 4743 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.188102 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c4
0fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.198893 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.209650 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.229185 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.278046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.278083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.278091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.278105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.278114 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.379535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.379567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.379575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.379617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.379642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.481899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.481931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.481943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.481960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.481973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.584436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.584472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.584483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.584499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.584513 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.688127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.688186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.688206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.688234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.688255 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.774791 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:12 crc kubenswrapper[4743]: E1125 15:59:12.774927 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.792818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.792870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.792883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.792903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.792916 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.895311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.895363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.895374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.895393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.895407 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.981332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" event={"ID":"c2d6248c-be7e-48f3-b314-6089c361b67a","Type":"ContainerStarted","Data":"f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b"}
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.985539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71"}
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.985788 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.985808 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.997361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.997407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.997417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.997433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.997443 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:12Z","lastTransitionTime":"2025-11-25T15:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:12 crc kubenswrapper[4743]: I1125 15:59:12.999208 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:12Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.010368 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.014443 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.024630 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.039261 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee
65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c63
32d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.058885 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.072240 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.085344 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.100485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.100520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.100531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.100547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.100557 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.101160 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.113097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.127795 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.142391 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.157322 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.168673 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.181842 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.197201 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.202501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.202554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.202567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.202583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.202615 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.218563 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.236646 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.252907 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.269411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.286662 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.304976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.305027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.305039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.305056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.305069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.312334 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.324189 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.336061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.356296 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.368727 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.379003 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.395011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.406954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.406997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.407007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.407025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.407037 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.410295 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.422178 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.433237 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:13Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.509341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.509385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.509396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.509411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.509423 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.611353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.611395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.611412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.611427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.611439 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.713421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.713461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.713471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.713485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.713495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.774340 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.774340 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:13 crc kubenswrapper[4743]: E1125 15:59:13.774491 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:13 crc kubenswrapper[4743]: E1125 15:59:13.774525 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.815892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.815925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.815934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.815947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.815957 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.918762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.918802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.918812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.918826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.918835 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:13Z","lastTransitionTime":"2025-11-25T15:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:13 crc kubenswrapper[4743]: I1125 15:59:13.989612 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.014490 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.021393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.021443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.021461 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.021483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.021499 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.035054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.048033 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.074077 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.094289 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.105445 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.115877 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.124564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.124625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.124638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.124655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.124667 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.127647 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.137648 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.147115 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.161117 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.175086 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.185480 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.199161 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.213479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.226411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.227622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.227661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.227682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.227698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.227709 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.329444 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.329481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.329493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.329532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.329545 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.432332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.432378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.432388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.432405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.432415 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.535507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.535553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.535564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.535583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.535605 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.638976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.639061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.639073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.639093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.639115 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.741483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.741527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.741538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.741554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.741568 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.774808 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:14 crc kubenswrapper[4743]: E1125 15:59:14.774933 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.843956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.844012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.844024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.844043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.844083 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.947491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.947529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.947539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.947554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:14 crc kubenswrapper[4743]: I1125 15:59:14.947564 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:14Z","lastTransitionTime":"2025-11-25T15:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.049968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.050008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.050017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.050032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.050042 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.152257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.152301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.152326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.152343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.152353 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.254744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.254790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.254802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.254820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.254832 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.357090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.357141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.357155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.357170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.357180 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.459670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.459716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.459727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.459744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.459755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.561923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.562401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.562417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.562432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.562442 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.664167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.664208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.664217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.664230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.664243 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.766681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.766717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.766726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.766741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.766750 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.774290 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.774296 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:15 crc kubenswrapper[4743]: E1125 15:59:15.774420 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:15 crc kubenswrapper[4743]: E1125 15:59:15.774536 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.869484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.869571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.869581 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.869616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.869628 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.971655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.971712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.971722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.971740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.971752 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:15Z","lastTransitionTime":"2025-11-25T15:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.996199 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/0.log" Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.998785 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71" exitCode=1 Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.998834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71"} Nov 25 15:59:15 crc kubenswrapper[4743]: I1125 15:59:15.999654 4743 scope.go:117] "RemoveContainer" containerID="044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.022554 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.034193 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.046061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.063651 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 
15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6
b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.074301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.074337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.074345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.074360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.074369 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.077575 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.091260 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.103204 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.113131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.124701 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.134061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.147668 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.161020 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.175180 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.176693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.176725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.176736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.176753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.176766 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.188210 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.205071 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.279176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.279233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.279246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.279263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.279277 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.382172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.382211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.382225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.382239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.382249 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.400512 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.400731 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:59:32.400697775 +0000 UTC m=+51.522537324 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.484695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.484746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.484754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.484769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.484778 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.501125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.501178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.501205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.501232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501288 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501309 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501323 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501331 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501335 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501352 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501369 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:32.501354531 +0000 UTC m=+51.623194080 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501371 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501385 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501391 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:32.501376222 +0000 UTC m=+51.623215771 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501407 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:32.501401193 +0000 UTC m=+51.623240742 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.501417 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:59:32.501412593 +0000 UTC m=+51.623252132 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.587803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.587872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.587885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.587905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.587920 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.690547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.690640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.690657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.690684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.690711 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.712762 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.724406 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.736637 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.745885 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.759063 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.772189 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.774276 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:16 crc kubenswrapper[4743]: E1125 15:59:16.774436 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.791277 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.792529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.792560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.792569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.792605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.792621 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.804675 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.841488 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.868676 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 
15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6
b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.879980 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.890113 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.894645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.894680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.894689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.894705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.894715 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.900434 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.910210 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.919226 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.926794 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.936188 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n"] Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.936539 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.938225 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.939716 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.951478 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.965164 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.977829 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.990441 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.997002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.997058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.997082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.997102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:16 crc kubenswrapper[4743]: I1125 15:59:16.997115 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:16Z","lastTransitionTime":"2025-11-25T15:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.003265 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.006051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.006091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.006140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.006163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzf9\" (UniqueName: \"kubernetes.io/projected/8169051d-4e2b-48e2-96e1-c113cadf2d76-kube-api-access-qjzf9\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: 
\"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.007437 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/0.log" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.009958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.010259 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.026760 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.040924 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.055896 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.076126 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 
15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6
b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.091812 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.099709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.099887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.099958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.100024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.100080 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.105858 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.107460 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.107540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzf9\" (UniqueName: \"kubernetes.io/projected/8169051d-4e2b-48e2-96e1-c113cadf2d76-kube-api-access-qjzf9\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.107611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.107640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.108246 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.108642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.114201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8169051d-4e2b-48e2-96e1-c113cadf2d76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.120131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.126211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzf9\" (UniqueName: \"kubernetes.io/projected/8169051d-4e2b-48e2-96e1-c113cadf2d76-kube-api-access-qjzf9\") pod \"ovnkube-control-plane-749d76644c-fhb8n\" (UID: \"8169051d-4e2b-48e2-96e1-c113cadf2d76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.131755 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.145326 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.157899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.169928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.181449 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.193370 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.201892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.201930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.201938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.201953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.201962 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.211665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 
15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.213073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.213104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.213112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.213129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.213139 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.225083 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.228389 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.228432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.228441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.228459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.228476 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.231848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.238728 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.241717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.241748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.241758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.241776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.241786 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.243012 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.248041 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.252901 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.253833 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.256332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.256378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.256394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc 
kubenswrapper[4743]: I1125 15:59:17.256416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.256432 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: W1125 15:59:17.263913 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8169051d_4e2b_48e2_96e1_c113cadf2d76.slice/crio-7b16b11fea0f21e6ea9b44fe3e055a91bdc43dde0762dbec0b66f59fda3cff44 WatchSource:0}: Error finding container 7b16b11fea0f21e6ea9b44fe3e055a91bdc43dde0762dbec0b66f59fda3cff44: Status 404 returned error can't find the container with id 7b16b11fea0f21e6ea9b44fe3e055a91bdc43dde0762dbec0b66f59fda3cff44 Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.266754 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.271555 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.277718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.277853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.277935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.278032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.278190 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.282548 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.291925 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.292041 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.294479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd1
5c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.304490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.304563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.304575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.304643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.304659 4743 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.310169 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.324388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.335269 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.348928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.363080 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.375299 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.387356 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:17Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.407928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.407990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.408001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.408017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.408027 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.510976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.511012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.511020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.511034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.511043 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.612681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.612740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.612768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.612792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.612806 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.715301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.715339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.715350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.715365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.715375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.773948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.773980 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.774075 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:17 crc kubenswrapper[4743]: E1125 15:59:17.774147 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.817064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.817335 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.817415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.817546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.817659 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.919291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.919655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.919756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.919847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:17 crc kubenswrapper[4743]: I1125 15:59:17.919950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:17Z","lastTransitionTime":"2025-11-25T15:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.017282 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/1.log" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.017765 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/0.log" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.020822 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed" exitCode=1 Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.020932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.020975 4743 scope.go:117] "RemoveContainer" containerID="044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021434 
4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.021913 4743 scope.go:117] "RemoveContainer" containerID="4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed" Nov 25 15:59:18 crc kubenswrapper[4743]: E1125 15:59:18.022180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.022380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" event={"ID":"8169051d-4e2b-48e2-96e1-c113cadf2d76","Type":"ContainerStarted","Data":"7b16b11fea0f21e6ea9b44fe3e055a91bdc43dde0762dbec0b66f59fda3cff44"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.040125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.051377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.063475 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.080494 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\
\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 
15:59:18.095229 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b369
4e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 
14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.109229 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.122395 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.123425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.123466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.123477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.123496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.123508 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.134536 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.147832 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.164110 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.177337 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.189724 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.200075 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.211707 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.226655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.226687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.226698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.226712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.226724 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.229003 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.240692 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.329947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.329990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.330001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.330020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.330032 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.432153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.432223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.432237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.432258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.432271 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.534927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.534975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.534989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.535005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.535016 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.637203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.637305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.637323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.637354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.637375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.740322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.740360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.740370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.740383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.740392 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.773920 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:18 crc kubenswrapper[4743]: E1125 15:59:18.774071 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.786073 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-s9t79"] Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.786832 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:18 crc kubenswrapper[4743]: E1125 15:59:18.786932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.805061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.820067 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.838345 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59b
b3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:
59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.843633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.843713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.843735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.843766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.843787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.854483 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.867565 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.879862 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.901159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.922103 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://044bc1e614b05a2388637be4aa7c87f76eec1d09a3eed4259013fccdaa5a0e71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:15Z\\\",\\\"message\\\":\\\"oval\\\\nI1125 15:59:15.189168 6050 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1125 15:59:15.189175 6050 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1125 15:59:15.189199 6050 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1125 15:59:15.189767 6050 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI1125 15:59:15.189782 6050 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 15:59:15.189783 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:15.189822 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:15.189852 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:15.189877 6050 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:15.189889 6050 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:15.189905 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:15.189949 6050 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:15.189966 6050 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:15.189969 6050 factory.go:656] Stopping watch factory\\\\nI1125 15:59:15.189976 6050 handler.go:208] Removed *v1.Node event handler 7\\\\nI1125 15:59:15.189985 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:59:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\
\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 
15:59:18.925302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd9cz\" (UniqueName: \"kubernetes.io/projected/617512f9-f767-4615-a9d2-132c6c73a69d-kube-api-access-bd9cz\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.925450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.945416 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID
\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.946320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.946348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 
15:59:18.946358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.946390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.946413 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:18Z","lastTransitionTime":"2025-11-25T15:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.959910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.975226 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:18 crc kubenswrapper[4743]: I1125 15:59:18.990622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:18Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.005492 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.020956 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.026556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd9cz\" (UniqueName: \"kubernetes.io/projected/617512f9-f767-4615-a9d2-132c6c73a69d-kube-api-access-bd9cz\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.026696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.026858 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.026951 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:19.526906282 +0000 UTC m=+38.648745851 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.028668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" event={"ID":"8169051d-4e2b-48e2-96e1-c113cadf2d76","Type":"ContainerStarted","Data":"912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.028776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" event={"ID":"8169051d-4e2b-48e2-96e1-c113cadf2d76","Type":"ContainerStarted","Data":"3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.030808 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/1.log" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.035121 4743 scope.go:117] "RemoveContainer" containerID="4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed" Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.035255 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.039352 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc 
kubenswrapper[4743]: I1125 15:59:19.048835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.048886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.048898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.048919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.048932 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.049727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd9cz\" (UniqueName: \"kubernetes.io/projected/617512f9-f767-4615-a9d2-132c6c73a69d-kube-api-access-bd9cz\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.055957 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.074313 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.093576 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.110705 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.125040 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.139009 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.150416 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.152613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 
15:59:19.152684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.152699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.152729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.152744 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.162728 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.177132 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc 
kubenswrapper[4743]: I1125 15:59:19.191384 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.208116 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.221806 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.239447 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.255548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.255626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.255637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.255657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.255674 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.260399 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.274906 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.297026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.315104 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.331617 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.351289 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:19Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.358661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.358735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.358748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.358769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.358782 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.462155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.462239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.462264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.462297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.462324 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.532553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.532747 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.532814 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:20.532793382 +0000 UTC m=+39.654632941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.565113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.565168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.565181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.565200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.565212 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.668570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.668623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.668632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.668647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.668657 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.771362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.771422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.771433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.771448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.771462 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.773915 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.773925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.774039 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:19 crc kubenswrapper[4743]: E1125 15:59:19.774129 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.874285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.874346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.874358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.874379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.874391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.976967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.977003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.977012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.977026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:19 crc kubenswrapper[4743]: I1125 15:59:19.977035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:19Z","lastTransitionTime":"2025-11-25T15:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.079211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.079251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.079262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.079277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.079287 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.182232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.182269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.182281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.182300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.182312 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.284805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.284841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.284852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.284870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.284881 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.387460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.387501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.387514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.387531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.387545 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.493258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.493807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.493820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.493839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.493856 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.543352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:20 crc kubenswrapper[4743]: E1125 15:59:20.543480 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:20 crc kubenswrapper[4743]: E1125 15:59:20.543548 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:22.543529868 +0000 UTC m=+41.665369417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.596522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.596562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.596571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.596585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.596612 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.698917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.698951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.698964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.698983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.698997 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.773895 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.773895 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:20 crc kubenswrapper[4743]: E1125 15:59:20.774052 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:20 crc kubenswrapper[4743]: E1125 15:59:20.774112 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.800970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.801010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.801021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.801036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.801045 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.904120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.904246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.904282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.904322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:20 crc kubenswrapper[4743]: I1125 15:59:20.904356 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:20Z","lastTransitionTime":"2025-11-25T15:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.007637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.007674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.007686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.007699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.007708 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.110462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.110536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.110549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.110568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.110580 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.213480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.213534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.213548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.213568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.213580 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.316414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.316459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.316466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.316481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.316490 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.421059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.421115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.421127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.421144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.421157 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.523764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.523807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.523819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.523833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.523843 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.626271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.626317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.626329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.626345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.626356 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.728824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.728896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.728909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.728926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.728962 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.774683 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:21 crc kubenswrapper[4743]: E1125 15:59:21.774919 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.774976 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:21 crc kubenswrapper[4743]: E1125 15:59:21.775083 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.789411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.802226 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.811292 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.824798 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.830575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.830627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.830642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.830663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.830676 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.840635 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.849420 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.866959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.878325 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.894120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.911043 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.922735 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.932710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.932747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.932756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.932770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.932791 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:21Z","lastTransitionTime":"2025-11-25T15:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.935238 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.945359 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.955191 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.967780 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.978656 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:21 crc kubenswrapper[4743]: I1125 15:59:21.987973 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:22 crc 
kubenswrapper[4743]: I1125 15:59:22.035413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.035454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.035462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.035476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.035490 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.138440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.138501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.138514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.138534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.138546 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.241263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.241301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.241313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.241330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.241343 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.343760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.343820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.343834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.343850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.343860 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.446169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.446215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.446223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.446237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.446246 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.548995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.549041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.549049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.549068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.549079 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.562499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:22 crc kubenswrapper[4743]: E1125 15:59:22.562643 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:22 crc kubenswrapper[4743]: E1125 15:59:22.562706 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:26.56268701 +0000 UTC m=+45.684526559 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.651334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.651364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.651396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.651409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.651419 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.754143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.754214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.754231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.754251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.754262 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.774756 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.774756 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:22 crc kubenswrapper[4743]: E1125 15:59:22.774882 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:22 crc kubenswrapper[4743]: E1125 15:59:22.774944 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.856963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.857003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.857012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.857026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.857035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.959120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.959156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.959165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.959180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:22 crc kubenswrapper[4743]: I1125 15:59:22.959190 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:22Z","lastTransitionTime":"2025-11-25T15:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.061274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.061305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.061314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.061328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.061338 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.162801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.162830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.162854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.162868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.162876 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.265447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.265485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.265496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.265513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.265525 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.367884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.367951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.367967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.367984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.367997 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.470531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.470575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.470586 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.470624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.470638 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.573508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.573559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.573571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.573615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.573628 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.675861 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.675911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.675920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.675934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.675944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.774575 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:23 crc kubenswrapper[4743]: E1125 15:59:23.774787 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.774886 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:23 crc kubenswrapper[4743]: E1125 15:59:23.775133 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.778044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.778468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.778657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.778811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.778946 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.883171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.883463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.883559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.883703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.883799 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.986360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.986639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.986753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.986904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:23 crc kubenswrapper[4743]: I1125 15:59:23.987015 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:23Z","lastTransitionTime":"2025-11-25T15:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.089701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.089763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.089780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.089802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.089829 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.192073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.192128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.192141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.192159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.192171 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.294777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.294815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.294826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.294844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.294855 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.397484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.397522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.397531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.397548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.397559 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.500130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.500179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.500193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.500210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.500223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.602658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.602688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.602696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.602708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.602718 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.706031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.706175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.706206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.706231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.706248 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.774302 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.774376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:24 crc kubenswrapper[4743]: E1125 15:59:24.774450 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:24 crc kubenswrapper[4743]: E1125 15:59:24.774527 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.809333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.809374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.809386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.809404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.809415 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.911583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.911634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.911645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.911658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:24 crc kubenswrapper[4743]: I1125 15:59:24.911668 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:24Z","lastTransitionTime":"2025-11-25T15:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.014416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.014512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.014553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.014575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.014613 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.116747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.116776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.116784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.116805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.116822 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.219723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.219770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.219786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.219806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.219823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.322140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.322207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.322221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.322238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.322324 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.424203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.424233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.424241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.424254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.424263 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.527187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.527241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.527261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.527286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.527306 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.629779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.629827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.629836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.629851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.629860 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.732366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.732405 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.732415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.732430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.732440 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.774144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.774253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:25 crc kubenswrapper[4743]: E1125 15:59:25.774374 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:25 crc kubenswrapper[4743]: E1125 15:59:25.774463 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.834606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.834647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.834657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.834672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.834681 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.937374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.937440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.937452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.937468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:25 crc kubenswrapper[4743]: I1125 15:59:25.937478 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:25Z","lastTransitionTime":"2025-11-25T15:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.040211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.040258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.040268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.040284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.040294 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.142323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.142386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.142399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.142414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.142424 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.244895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.244944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.244954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.244967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.244976 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.346996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.347057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.347072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.347089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.347098 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.450310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.450358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.450371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.450395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.450409 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.554399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.554452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.554465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.554487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.554502 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.605124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:26 crc kubenswrapper[4743]: E1125 15:59:26.605399 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:26 crc kubenswrapper[4743]: E1125 15:59:26.605486 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:34.605460018 +0000 UTC m=+53.727299567 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.657722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.657763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.657792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.657810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.657824 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.761255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.761310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.761320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.761344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.761358 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.774670 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.774802 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:26 crc kubenswrapper[4743]: E1125 15:59:26.774861 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:26 crc kubenswrapper[4743]: E1125 15:59:26.775016 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.864874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.864926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.864935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.864960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.864969 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.967576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.967697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.967712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.967728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:26 crc kubenswrapper[4743]: I1125 15:59:26.967740 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:26Z","lastTransitionTime":"2025-11-25T15:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.070033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.070072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.070088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.070105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.070116 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.172584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.172653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.172664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.172683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.172695 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.274625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.274660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.274668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.274682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.274691 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.377201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.377233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.377243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.377258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.377269 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.480270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.480301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.480311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.480324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.480333 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.583266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.583301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.583309 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.583324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.583335 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.608774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.608830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.608847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.608870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.608890 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.626117 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.633764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.633844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.633860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.633884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.633908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.646760 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.651530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.651575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.651584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.651618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.651629 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.663900 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.668868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.668969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.668995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.669033 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.669059 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.687129 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.692676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.692724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.692738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.692757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.692771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.705871 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.706012 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.707835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.707882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.707891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.707911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.707923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.774183 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.774183 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.774362 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:27 crc kubenswrapper[4743]: E1125 15:59:27.774479 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.810798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.810867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.810878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.810895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.810906 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.914769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.914841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.914851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.914874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:27 crc kubenswrapper[4743]: I1125 15:59:27.914885 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:27Z","lastTransitionTime":"2025-11-25T15:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.016488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.017625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.017699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.017719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.017749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.017770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.027519 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.034902 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 
crc kubenswrapper[4743]: I1125 15:59:28.053871 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.072424 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.089030 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.108547 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.121337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.121385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.121396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.121412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.121422 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.131766 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.145195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.159906 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.179015 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.194994 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a2
08722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.208234 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.221393 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc 
kubenswrapper[4743]: I1125 15:59:28.224067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.224109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.224121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.224146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.224160 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.238880 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.252835 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.266382 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.277553 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.289681 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:28Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.326508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.326563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.326576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.326625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.326640 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.429156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.429233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.429245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.429260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.429269 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.532217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.532267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.532280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.532299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.532313 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.635828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.635871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.635882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.635900 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.635914 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.738928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.739000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.739013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.739041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.739057 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.774876 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.774926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:28 crc kubenswrapper[4743]: E1125 15:59:28.775090 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:28 crc kubenswrapper[4743]: E1125 15:59:28.775265 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.842867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.842944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.842963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.842990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.843009 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.946333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.946401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.946418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.946446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:28 crc kubenswrapper[4743]: I1125 15:59:28.946462 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:28Z","lastTransitionTime":"2025-11-25T15:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.049441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.049507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.049517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.049545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.049562 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.153284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.153351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.153364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.153387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.153404 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.256111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.256218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.256259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.256297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.256310 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.359355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.359421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.359430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.359451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.359463 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.463089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.463161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.463178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.463203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.463223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.566245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.566302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.566319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.566339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.566353 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.668766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.668815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.668825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.668842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.668853 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.771977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.772046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.772058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.772081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.772094 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.774731 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.774849 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:29 crc kubenswrapper[4743]: E1125 15:59:29.775019 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:29 crc kubenswrapper[4743]: E1125 15:59:29.775131 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.873973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.874012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.874021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.874035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.874045 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.976999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.977045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.977056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.977070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:29 crc kubenswrapper[4743]: I1125 15:59:29.977080 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:29Z","lastTransitionTime":"2025-11-25T15:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.078947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.078995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.079011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.079028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.079039 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.181666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.181708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.181727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.181746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.181755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.284418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.284486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.284498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.284515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.284527 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.387304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.387358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.387374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.387392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.387403 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.490065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.490113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.490124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.490143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.490158 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.593192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.593271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.593290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.593318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.593339 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.695633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.695661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.695669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.695682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.695691 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.773946 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.774078 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:30 crc kubenswrapper[4743]: E1125 15:59:30.774116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:30 crc kubenswrapper[4743]: E1125 15:59:30.774236 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.775439 4743 scope.go:117] "RemoveContainer" containerID="4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.798154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.798188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.798199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.798216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.798227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.901570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.901957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.901969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.901989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:30 crc kubenswrapper[4743]: I1125 15:59:30.902002 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:30Z","lastTransitionTime":"2025-11-25T15:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.004646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.004684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.004693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.004708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.004728 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.068434 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/1.log" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.070802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.071363 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.086410 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cd
ade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.100169 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.107363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc 
kubenswrapper[4743]: I1125 15:59:31.107555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.107576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.107628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.107644 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.125768 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e0
92257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.138695 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.154890 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.171550 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.183035 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.207975 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.210256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.210297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.210308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.210329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.210340 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.226699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.241153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.256800 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.273483 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.286179 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.300196 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.312803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.312858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.312872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.312892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.312906 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.314865 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc 
kubenswrapper[4743]: I1125 15:59:31.329688 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.343703 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.357173 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.415529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.415576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.415606 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.415621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.415633 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.519001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.519066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.519082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.519109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.519128 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.622473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.622534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.622545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.622566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.622578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.726065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.726168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.726189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.726218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.726240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.774135 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.774130 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:31 crc kubenswrapper[4743]: E1125 15:59:31.774295 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:31 crc kubenswrapper[4743]: E1125 15:59:31.774442 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.793128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.807992 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59
:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.829644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.829715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.829734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.829756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.829770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.831141 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc 
kubenswrapper[4743]: I1125 15:59:31.847734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.861690 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.873730 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.887841 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.908927 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.923614 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.932773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 
15:59:31.932936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.933042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.933117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.933223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:31Z","lastTransitionTime":"2025-11-25T15:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.935876 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fc
d5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.950902 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.967690 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.982263 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:31 crc kubenswrapper[4743]: I1125 15:59:31.997991 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.018327 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.036468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.036506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.036514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.036528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.036538 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.037293 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.050717 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.073988 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.076101 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/2.log" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.076960 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/1.log" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.079462 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" exitCode=1 Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.079507 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.079548 4743 scope.go:117] "RemoveContainer" containerID="4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.080294 4743 scope.go:117] "RemoveContainer" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.080510 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.099352 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.114154 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.129721 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.138680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.138713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.138721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.138737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.138749 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.146879 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.163812 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.177607 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.194408 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.211278 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.231796 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.241439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.241496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.241505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.241522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.241532 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.257620 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3457db30219eea56eb5b8b55cc637fec51a78dc6202665b80fdf0ba3c0baed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:17Z\\\",\\\"message\\\":\\\"t network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:16Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:59:16.804231 6194 services_controller.go:451] Built service openshift-controller-manager/controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager/controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager/controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.149\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateM\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending 
*v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: 
I1125 15:59:32.280339 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.297672 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c0
4bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.313662 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.327660 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.340646 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.343453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 
15:59:32.343479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.343488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.343501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.343509 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.352112 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.365819 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc 
kubenswrapper[4743]: I1125 15:59:32.382738 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:32Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.446649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.446705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.446719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.446739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.446751 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.471168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.471329 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:00:04.471303628 +0000 UTC m=+83.593143187 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.549918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.549993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.550009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.550035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.550058 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.572371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.572428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.572461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.572490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572558 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572655 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572622 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572729 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 16:00:04.572702239 +0000 UTC m=+83.694541788 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572734 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572793 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 16:00:04.572746971 +0000 UTC m=+83.694586510 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572815 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572805 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572871 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 16:00:04.572852424 +0000 UTC m=+83.694692073 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572878 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572897 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.572968 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 16:00:04.572940207 +0000 UTC m=+83.694779756 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.653004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.653043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.653052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.653068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.653078 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.756795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.756858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.756870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.756908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.756922 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.774279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.774304 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.774440 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:32 crc kubenswrapper[4743]: E1125 15:59:32.774635 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.859562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.859634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.859650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.859669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.859681 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.963804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.963867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.963880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.963896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:32 crc kubenswrapper[4743]: I1125 15:59:32.963910 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:32Z","lastTransitionTime":"2025-11-25T15:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.068508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.068635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.068649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.068672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.068691 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.085112 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/2.log" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.089850 4743 scope.go:117] "RemoveContainer" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" Nov 25 15:59:33 crc kubenswrapper[4743]: E1125 15:59:33.090033 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.106244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.123959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59b
b3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:
59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.137828 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.156455 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-con
troller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.171815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.171879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.171890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.172228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.172247 4743 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.172833 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.190461 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.217422 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.247902 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.264024 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.275643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.275715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.275732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 
15:59:33.275798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.275830 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.280903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.298540 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.312673 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.326002 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.341282 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc 
kubenswrapper[4743]: I1125 15:59:33.358990 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.376085 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.378841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.378893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.378907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.378928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.378940 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.391189 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.406913 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:33Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.481921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.481985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.481993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.482007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.482017 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.585435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.585502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.585520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.585548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.585572 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.688609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.688669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.688682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.688698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.688708 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.774524 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:33 crc kubenswrapper[4743]: E1125 15:59:33.774713 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.774778 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:33 crc kubenswrapper[4743]: E1125 15:59:33.775005 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.791101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.791160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.791174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.791197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.791215 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.896519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.896716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.896745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.896812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.896835 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.999784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.999848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:33 crc kubenswrapper[4743]: I1125 15:59:33.999859 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:33.999891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:33.999907 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:33Z","lastTransitionTime":"2025-11-25T15:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.102524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.102580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.102617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.102682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.102703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.205578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.205678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.205692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.205720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.205734 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.308896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.308950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.308962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.308978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.308992 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.412495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.412553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.412568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.412623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.412644 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.516353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.516403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.516414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.516434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.516446 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.619809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.619897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.619918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.619944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.619959 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.696409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:34 crc kubenswrapper[4743]: E1125 15:59:34.697300 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:34 crc kubenswrapper[4743]: E1125 15:59:34.697478 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 15:59:50.697455299 +0000 UTC m=+69.819294848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.730754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.730804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.730813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.730835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.730853 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.774361 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:34 crc kubenswrapper[4743]: E1125 15:59:34.774533 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.774948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:34 crc kubenswrapper[4743]: E1125 15:59:34.775153 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.834290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.834349 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.834361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.834381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.834393 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.937172 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.937251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.937266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.937289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:34 crc kubenswrapper[4743]: I1125 15:59:34.937308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:34Z","lastTransitionTime":"2025-11-25T15:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.040242 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.040297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.040311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.040374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.040390 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.158516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.158558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.158567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.158583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.158616 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.261863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.261973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.261998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.262032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.262053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.365733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.365791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.365804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.365822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.365836 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.468805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.468867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.468880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.468903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.468916 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.573488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.573578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.573658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.573700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.573721 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.676672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.676753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.676775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.676795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.676807 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.774755 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.774756 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:35 crc kubenswrapper[4743]: E1125 15:59:35.775017 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:35 crc kubenswrapper[4743]: E1125 15:59:35.775116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.779521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.779562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.779580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.779645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.779673 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.882809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.882858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.882868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.882883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.882894 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.986368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.986406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.986415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.986431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:35 crc kubenswrapper[4743]: I1125 15:59:35.986441 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:35Z","lastTransitionTime":"2025-11-25T15:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.090638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.090700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.090711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.090729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.090739 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.197949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.198013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.198022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.198041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.198055 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.301205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.301266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.301275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.301297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.301309 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.404197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.404282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.404302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.404333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.404354 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.507935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.507988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.508003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.508030 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.508047 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.611327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.611384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.611396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.611418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.611432 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.714836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.714923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.714946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.714979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.715006 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.774546 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.774635 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:36 crc kubenswrapper[4743]: E1125 15:59:36.774753 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:36 crc kubenswrapper[4743]: E1125 15:59:36.774992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.818093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.818161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.818180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.818205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.818223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.920903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.920973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.920988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.921009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:36 crc kubenswrapper[4743]: I1125 15:59:36.921021 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:36Z","lastTransitionTime":"2025-11-25T15:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.024009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.024064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.024073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.024093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.024107 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.126644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.126693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.126710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.126734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.126752 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.229803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.229851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.229873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.229894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.229907 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.333012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.333074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.333089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.333109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.333129 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.435960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.436036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.436055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.436085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.436103 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.539481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.539540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.539576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.539673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.539696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.643386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.643450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.643466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.643488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.643504 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.747280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.747332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.747347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.747364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.747377 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.774525 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.774816 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.774938 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.775170 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.844660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.844747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.844768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.844796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.844817 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.866948 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.871782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.871841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.871856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.871879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.871893 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.888004 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.892942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.892992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.893002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.893020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.893031 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.950763 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:37 crc kubenswrapper[4743]: E1125 15:59:37.950928 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.953206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.953334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.953407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.953488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:37 crc kubenswrapper[4743]: I1125 15:59:37.953560 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:37Z","lastTransitionTime":"2025-11-25T15:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.057329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.057380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.057390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.057416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.057430 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.160736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.160895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.160914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.161390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.161404 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.264625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.264662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.264670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.264685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.264694 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.368169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.368447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.368486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.368544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.368577 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.472207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.472261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.472275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.472293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.472308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.576170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.576216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.576227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.576244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.576256 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.678959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.679031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.679041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.679056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.679066 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.774494 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.774494 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:38 crc kubenswrapper[4743]: E1125 15:59:38.774691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:38 crc kubenswrapper[4743]: E1125 15:59:38.774737 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.782274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.782367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.782397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.782428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.782448 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.885447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.885806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.885951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.886029 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.886103 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.989454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.989743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.989821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.989949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:38 crc kubenswrapper[4743]: I1125 15:59:38.990025 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:38Z","lastTransitionTime":"2025-11-25T15:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.093818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.093889 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.093906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.093934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.093953 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.196753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.196814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.196828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.196860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.196877 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.299814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.299864 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.299877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.299896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.299909 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.403387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.403489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.403510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.403535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.403560 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.507290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.507358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.507377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.507430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.507455 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.610626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.610670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.610683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.610703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.610716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.714051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.714117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.714133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.714159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.714178 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.774254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.774272 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:39 crc kubenswrapper[4743]: E1125 15:59:39.774461 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:39 crc kubenswrapper[4743]: E1125 15:59:39.774573 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.817428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.817487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.817497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.817516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.817528 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.920687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.920728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.920741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.920760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:39 crc kubenswrapper[4743]: I1125 15:59:39.920773 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:39Z","lastTransitionTime":"2025-11-25T15:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.023977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.024021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.024031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.024057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.024074 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.128186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.128233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.128244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.128269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.128282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.232336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.232397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.232415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.232438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.232456 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.335542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.335670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.335689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.335715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.335733 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.438769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.438813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.438821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.438838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.438850 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.542630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.542688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.542701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.542722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.542737 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.646847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.646985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.647013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.647050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.647077 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.750645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.750715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.750735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.750758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.750782 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.774936 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:40 crc kubenswrapper[4743]: E1125 15:59:40.775133 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.775585 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:40 crc kubenswrapper[4743]: E1125 15:59:40.775894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.853862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.853938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.853948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.853966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.853976 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.957206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.957260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.957270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.957288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:40 crc kubenswrapper[4743]: I1125 15:59:40.957299 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:40Z","lastTransitionTime":"2025-11-25T15:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.060398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.060738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.060758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.060784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.060802 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.163257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.163312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.163328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.163357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.163371 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.266346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.266393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.266404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.266422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.266438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.369651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.369705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.369716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.369737 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.369751 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.473123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.473465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.473719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.473959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.474170 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.576906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.576945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.576956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.576974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.576988 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.680720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.680818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.680840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.680874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.680892 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.774764 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:41 crc kubenswrapper[4743]: E1125 15:59:41.774974 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.775080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:41 crc kubenswrapper[4743]: E1125 15:59:41.775163 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.783185 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.783471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.783482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.783498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.783512 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.791877 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a79
7c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.802995 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.819722 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.836119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.852007 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fc
d5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.870712 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.887233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.887279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.887295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.887318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.887334 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.889157 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.906489 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.938954 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.951844 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.963153 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc 
kubenswrapper[4743]: I1125 15:59:41.977733 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990654 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:41Z","lastTransitionTime":"2025-11-25T15:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:41 crc kubenswrapper[4743]: I1125 15:59:41.990665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.001863 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.012658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.023848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.035405 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.047197 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.093432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.093477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.093489 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.093511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.093528 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.196396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.196470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.196483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.196499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.196508 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.300055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.300106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.300116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.300134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.300148 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.403995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.404074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.404094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.404121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.404143 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.507999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.508054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.508065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.508085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.508099 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.611775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.611834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.611845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.611868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.611891 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.716708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.716784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.716824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.716852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.716873 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.774820 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.774820 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:42 crc kubenswrapper[4743]: E1125 15:59:42.775078 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:42 crc kubenswrapper[4743]: E1125 15:59:42.775296 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.820222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.820295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.820314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.820345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.820368 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.924947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.925014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.925024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.925045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:42 crc kubenswrapper[4743]: I1125 15:59:42.925060 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:42Z","lastTransitionTime":"2025-11-25T15:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.029070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.029146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.029166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.029195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.029215 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.132738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.132791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.132805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.132825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.132838 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.236541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.236628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.236640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.236660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.236671 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.339492 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.339548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.339558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.339577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.339609 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.442970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.443034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.443052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.443079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.443096 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.546086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.546632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.546677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.546700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.546721 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.648897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.648946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.648961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.648981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.648994 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.752709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.752793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.752817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.752844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.752861 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.774270 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.774271 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:43 crc kubenswrapper[4743]: E1125 15:59:43.774427 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:43 crc kubenswrapper[4743]: E1125 15:59:43.774576 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.855769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.855826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.855846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.855869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.855884 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.959101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.959177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.959201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.959237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:43 crc kubenswrapper[4743]: I1125 15:59:43.959260 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:43Z","lastTransitionTime":"2025-11-25T15:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.062109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.062154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.062163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.062183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.062220 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.166254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.166381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.166404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.166434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.166456 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.270553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.270612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.270627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.270644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.270655 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.374038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.374120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.374139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.374169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.374188 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.477748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.477795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.477813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.477836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.477853 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.583188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.583240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.583261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.583293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.583314 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.686290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.686323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.686331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.686346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.686357 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.775533 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.776483 4743 scope.go:117] "RemoveContainer" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" Nov 25 15:59:44 crc kubenswrapper[4743]: E1125 15:59:44.776828 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.777063 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:44 crc kubenswrapper[4743]: E1125 15:59:44.777156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:44 crc kubenswrapper[4743]: E1125 15:59:44.777618 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.790074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.790108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.790116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.790133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.790145 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.896526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.896580 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.896632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.896711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:44 crc kubenswrapper[4743]: I1125 15:59:44.896743 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:44Z","lastTransitionTime":"2025-11-25T15:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.000565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.000743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.000780 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.000811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.000831 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.104323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.104374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.104387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.104407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.104418 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.207137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.207196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.207208 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.207230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.207243 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.310187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.310238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.310249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.310277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.310291 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.413499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.413712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.413728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.413771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.413791 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.517279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.517329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.517341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.517362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.517379 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.620255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.620303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.620315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.620331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.620343 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.723315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.723396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.723413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.723436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.723452 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.774459 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.774545 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:45 crc kubenswrapper[4743]: E1125 15:59:45.774777 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:45 crc kubenswrapper[4743]: E1125 15:59:45.775186 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.826582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.826674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.826693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.826720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.826739 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.929447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.929498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.929510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.929530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:45 crc kubenswrapper[4743]: I1125 15:59:45.929542 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:45Z","lastTransitionTime":"2025-11-25T15:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.031841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.031884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.031893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.031909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.031919 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.134829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.134894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.134905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.134928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.134942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.238448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.238495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.238505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.238522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.238537 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.341005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.341047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.341057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.341073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.341083 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.444356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.444397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.444406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.444424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.444436 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.547270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.547320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.547331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.547359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.547375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.650833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.650897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.650910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.650931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.650943 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.753607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.753653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.753666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.753684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.753700 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.774239 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.774302 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:46 crc kubenswrapper[4743]: E1125 15:59:46.774468 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:46 crc kubenswrapper[4743]: E1125 15:59:46.774889 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.856700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.856749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.856759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.856777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.856790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.960131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.960186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.960197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.960215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:46 crc kubenswrapper[4743]: I1125 15:59:46.960251 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:46Z","lastTransitionTime":"2025-11-25T15:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.062914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.062971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.062988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.063011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.063027 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.165937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.165993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.166004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.166026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.166038 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.268101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.268140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.268149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.268166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.268177 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.371184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.371244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.371257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.371282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.371301 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.474343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.474411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.474429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.474457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.474479 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.577706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.577783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.577810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.577845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.577871 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.680810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.680909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.680927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.680954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.680974 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.774860 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.774919 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:47 crc kubenswrapper[4743]: E1125 15:59:47.775104 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:47 crc kubenswrapper[4743]: E1125 15:59:47.775263 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.784093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.784138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.784150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.784163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.784174 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.886858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.886898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.886909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.886929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.886942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.989647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.989691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.989705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.989723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:47 crc kubenswrapper[4743]: I1125 15:59:47.989735 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:47Z","lastTransitionTime":"2025-11-25T15:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.092741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.092783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.092795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.092814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.092827 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.195823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.195887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.195903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.195926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.195945 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.263532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.263617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.263633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.263656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.263671 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.278581 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.283492 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.283550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.283563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.283606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.283621 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.300737 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.306484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.306546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.306561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.306632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.306647 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.320912 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.325551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.325625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.325640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.325662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.325675 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.342107 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.347263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.347369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.347384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.347428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.347444 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.360856 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:48Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.360988 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.363169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.363203 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.363213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.363229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.363241 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.466485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.466533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.466546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.466565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.466578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.569403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.569463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.569477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.569498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.569510 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.672751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.672813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.672833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.672862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.672877 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.774148 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.774255 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.774315 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:48 crc kubenswrapper[4743]: E1125 15:59:48.774513 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.775947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.775988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.776001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.776016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.776040 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.879654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.879713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.879728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.879755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.879770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.982157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.982211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.982221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.982239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:48 crc kubenswrapper[4743]: I1125 15:59:48.982253 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:48Z","lastTransitionTime":"2025-11-25T15:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.084854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.084915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.084931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.084962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.084991 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.188230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.188301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.188313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.188332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.188345 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.291567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.291652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.291664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.291690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.291707 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.394437 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.394479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.394489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.394510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.394521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.497678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.497748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.497763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.497783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.497798 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.600485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.600540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.600550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.600566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.600578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.704510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.704572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.704585 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.704622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.704637 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.780431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:49 crc kubenswrapper[4743]: E1125 15:59:49.780681 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.781024 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:49 crc kubenswrapper[4743]: E1125 15:59:49.781183 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.807770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.807844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.807865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.807894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.807911 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.910634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.910715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.910741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.910777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:49 crc kubenswrapper[4743]: I1125 15:59:49.910802 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:49Z","lastTransitionTime":"2025-11-25T15:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.013939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.013993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.014007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.014025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.014040 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.117602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.117658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.117668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.117688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.117702 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.221240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.221314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.221342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.221383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.221644 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.324695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.324756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.324769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.324791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.324812 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.433872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.433937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.433951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.433976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.433993 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.537146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.537197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.537212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.537236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.537252 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.715844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:50 crc kubenswrapper[4743]: E1125 15:59:50.716014 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:50 crc kubenswrapper[4743]: E1125 15:59:50.716082 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 16:00:22.716061281 +0000 UTC m=+101.837900830 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.717025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.717065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.717078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.717100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.717110 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.774800 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.774881 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:50 crc kubenswrapper[4743]: E1125 15:59:50.774958 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:50 crc kubenswrapper[4743]: E1125 15:59:50.775047 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.819904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.819959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.819974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.820001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.820020 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.922938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.922992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.923005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.923027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:50 crc kubenswrapper[4743]: I1125 15:59:50.923042 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:50Z","lastTransitionTime":"2025-11-25T15:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.026191 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.026256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.026267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.026285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.026314 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.129653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.129691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.129706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.129724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.129737 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.232713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.232853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.232874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.232902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.232923 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.335832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.335897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.335909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.335946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.335961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.439010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.439063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.439075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.439100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.439114 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.541618 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.541660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.541670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.541686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.541698 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.644893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.645212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.645274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.645342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.645413 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.748302 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.748667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.748779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.748890 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.748990 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.774040 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.774053 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:51 crc kubenswrapper[4743]: E1125 15:59:51.774570 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:51 crc kubenswrapper[4743]: E1125 15:59:51.774783 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.788077 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.798851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.812976 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.834916 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37
a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.847775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fc
d5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.851879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.851921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.851932 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.851950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.851965 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.871269 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b44
0106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.886294 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.902136 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.923519 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.938753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.953379 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.954600 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.954652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.954667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.954690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.954704 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:51Z","lastTransitionTime":"2025-11-25T15:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.965207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.977585 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.987709 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:51 crc kubenswrapper[4743]: I1125 15:59:51.998931 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.011006 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:52 crc 
kubenswrapper[4743]: I1125 15:59:52.029155 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.041225 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.057521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.057582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.057611 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.057630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.057645 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.160576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.160650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.160665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.160684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.160694 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.263939 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.264005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.264019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.264046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.264062 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.367036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.367078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.367088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.367154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.367166 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.469872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.469943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.469956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.469993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.470004 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.572394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.572472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.572485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.572505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.572522 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.675458 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.675522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.675539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.675563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.675578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.774858 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.774877 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:52 crc kubenswrapper[4743]: E1125 15:59:52.775094 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:52 crc kubenswrapper[4743]: E1125 15:59:52.775199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.777786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.777834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.777847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.777868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.777880 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.881224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.881271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.881281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.881305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.881327 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.984602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.984659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.984673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.984691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:52 crc kubenswrapper[4743]: I1125 15:59:52.984701 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:52Z","lastTransitionTime":"2025-11-25T15:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.087823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.087863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.087872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.087888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.087899 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.170934 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/0.log" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.170991 4743 generic.go:334] "Generic (PLEG): container finished" podID="2175b34c-5202-4e94-af0e-2f879b98c0bc" containerID="1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e" exitCode=1 Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.171035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerDied","Data":"1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.171523 4743 scope.go:117] "RemoveContainer" containerID="1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.187252 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.191930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.192457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.192471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 
15:59:53.192488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.192499 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.206138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.228620 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.250122 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.264753 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.276124 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.288332 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.295568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.295628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.295652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.295672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.295685 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.301756 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.311551 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.320881 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc 
kubenswrapper[4743]: I1125 15:59:53.340437 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.351726 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.362925 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.371708 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.382907 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.395974 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.398022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.398083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.398100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.398124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.398138 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.406759 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.419231 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.501860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.501903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.501913 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.501933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.501945 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.604200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.604248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.604261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.604281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.604293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.707329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.707384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.707401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.707425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.707438 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.774437 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.774625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:53 crc kubenswrapper[4743]: E1125 15:59:53.774748 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:53 crc kubenswrapper[4743]: E1125 15:59:53.774968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.809876 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.809938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.809956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.809980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.810001 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.913233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.913296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.913315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.913344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:53 crc kubenswrapper[4743]: I1125 15:59:53.913365 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:53Z","lastTransitionTime":"2025-11-25T15:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.016345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.016388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.016396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.016415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.016427 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.119486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.119537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.119548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.119565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.119578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.177968 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/0.log" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.178049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerStarted","Data":"47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.193985 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc 
kubenswrapper[4743]: I1125 15:59:54.210896 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.222842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.222920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.222948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.222984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.223010 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.230640 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T
15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.250939 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.264857 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.282865 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.297381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.314786 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.326129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.326193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.326207 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.326237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.326255 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.331585 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.347871 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.359905 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.383460 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.403366 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e0
92257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.414304 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.429527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.429569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.429578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.429610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.429621 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.446527 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.463423 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.481952 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.501457 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.532752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.532801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.532816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.532834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.532847 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.635273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.635316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.635325 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.635344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.635355 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.738464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.738525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.738544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.738571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.738629 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.774128 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.774163 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:54 crc kubenswrapper[4743]: E1125 15:59:54.774366 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:54 crc kubenswrapper[4743]: E1125 15:59:54.774554 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.841714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.841791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.841811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.841839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.841855 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.944460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.944493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.944503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.944519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:54 crc kubenswrapper[4743]: I1125 15:59:54.944528 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:54Z","lastTransitionTime":"2025-11-25T15:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.047666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.047711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.047725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.047743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.047755 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.150637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.150707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.150725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.150753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.150771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.253998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.254085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.254099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.254121 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.254136 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.357031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.357075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.357085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.357101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.357114 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.460639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.460709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.460721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.460746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.460762 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.563353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.563417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.563436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.563463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.563478 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.666390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.666445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.666459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.666481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.666496 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.768986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.769045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.769064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.769090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.769108 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.774326 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.774340 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:55 crc kubenswrapper[4743]: E1125 15:59:55.774467 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:55 crc kubenswrapper[4743]: E1125 15:59:55.774633 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.872174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.872225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.872238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.872254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.872265 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.974240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.974293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.974309 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.974338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:55 crc kubenswrapper[4743]: I1125 15:59:55.974354 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:55Z","lastTransitionTime":"2025-11-25T15:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.077478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.077531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.077543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.077565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.077579 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.181024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.181084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.181097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.181122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.181137 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.284308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.284357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.284369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.284388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.284404 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.388230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.388286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.388307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.388334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.388357 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.492105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.492207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.492228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.492260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.492283 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.595674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.596369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.596422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.596451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.596468 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.699123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.699180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.699196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.699222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.699239 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.774864 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.774957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:56 crc kubenswrapper[4743]: E1125 15:59:56.775141 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:56 crc kubenswrapper[4743]: E1125 15:59:56.775266 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.802639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.802697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.802709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.802732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.802747 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.906167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.906216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.906228 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.906249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:56 crc kubenswrapper[4743]: I1125 15:59:56.906264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:56Z","lastTransitionTime":"2025-11-25T15:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.010153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.010243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.010259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.010284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.010303 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.113669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.113732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.113747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.113770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.113790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.217388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.217447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.217460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.217483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.217510 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.320685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.320754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.320778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.320805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.320827 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.428640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.428844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.428883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.428917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.428938 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.533977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.534064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.534083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.534113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.534134 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.637902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.637953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.637970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.637989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.638003 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.740588 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.740669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.740688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.740715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.740734 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.774418 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.774418 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:57 crc kubenswrapper[4743]: E1125 15:59:57.774626 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:57 crc kubenswrapper[4743]: E1125 15:59:57.774740 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.843817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.843874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.843891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.843914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.843930 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.947475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.947546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.947567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.947636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:57 crc kubenswrapper[4743]: I1125 15:59:57.947665 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:57Z","lastTransitionTime":"2025-11-25T15:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.050705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.050782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.050798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.050821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.050839 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.154160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.154214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.154227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.154250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.154266 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.257269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.257341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.257363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.257384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.257402 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.361282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.361354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.361427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.361456 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.361507 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.464132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.464187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.464201 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.464222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.464240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.568453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.568527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.568545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.568583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.568643 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.672323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.672383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.672400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.672424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.672443 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.676099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.676169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.676188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.676215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.676234 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.697395 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:58Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.704226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.704326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.704341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.704387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.704401 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.722183 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:58Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.727833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.727894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.727909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.727937 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.727954 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.748169 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:58Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.754266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.754348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.754375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.754511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.754534 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.775019 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.775100 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.775192 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.775396 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.778054 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:58Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.784176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.784243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.784261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.784283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.784296 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.802012 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:59:58Z is after 2025-08-24T17:21:41Z" Nov 25 15:59:58 crc kubenswrapper[4743]: E1125 15:59:58.802248 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.804816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.804889 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.804907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.804930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.804944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.908220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.908290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.908312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.908338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:58 crc kubenswrapper[4743]: I1125 15:59:58.908354 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:58Z","lastTransitionTime":"2025-11-25T15:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.011823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.011869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.011878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.011894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.011911 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.115105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.115170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.115179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.115194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.115205 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.217912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.217983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.217994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.218010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.218024 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.321503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.321561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.321576 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.321626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.321642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.424650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.424729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.424745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.424767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.424782 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.527907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.527962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.527973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.527998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.528014 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.631050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.631104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.631115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.631138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.631149 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.734128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.734176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.734187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.734205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.734215 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.774900 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:59:59 crc kubenswrapper[4743]: E1125 15:59:59.775077 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.775204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:59:59 crc kubenswrapper[4743]: E1125 15:59:59.775841 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.776225 4743 scope.go:117] "RemoveContainer" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.836332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.836368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.836378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.836396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.836412 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.939189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.939225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.939255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.939273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:59:59 crc kubenswrapper[4743]: I1125 15:59:59.939286 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:59:59Z","lastTransitionTime":"2025-11-25T15:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.041891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.041931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.041941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.041959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.041969 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.145099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.145159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.145173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.145194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.145211 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.200444 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/2.log" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.204082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.204615 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.218685 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.231338 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.245473 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e0
92257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.247462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.247502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.247513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.247531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.247543 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.258140 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.271940 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.285775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.299406 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.316418 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.334344 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.345069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.349870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.349917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.349929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 
16:00:00.349948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.349959 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.358244 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.373983 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.384976 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 
16:00:00.396571 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.407786 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc 
kubenswrapper[4743]: I1125 16:00:00.421959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.435959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.446959 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:00Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.453320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.453567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.453723 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.453814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.453883 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.557301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.558035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.558091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.558120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.558137 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.661503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.661569 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.661582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.661619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.661631 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.765021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.765113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.765131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.765162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.765181 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.774326 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.774442 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:00 crc kubenswrapper[4743]: E1125 16:00:00.774665 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:00 crc kubenswrapper[4743]: E1125 16:00:00.774845 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.787122 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.868267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.868336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.868353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.868380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.868412 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.971869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.971915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.971930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.971955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:00 crc kubenswrapper[4743]: I1125 16:00:00.971972 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:00Z","lastTransitionTime":"2025-11-25T16:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.075124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.075196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.075222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.075251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.075269 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.178306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.178357 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.178367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.178388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.178400 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.210645 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/3.log" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.211246 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/2.log" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.215207 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" exitCode=1 Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.215242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.215311 4743 scope.go:117] "RemoveContainer" containerID="c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.216383 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:00:01 crc kubenswrapper[4743]: E1125 16:00:01.216764 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.232520 4743 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11
-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bc
b530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.245250 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-r
bac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.260283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92eda
f5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.276044 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.281619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.281654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.281665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.281678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.281689 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.295682 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.316386 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T16:00:00Z\\\",\\\"message\\\":\\\" 16:00:00.510138 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid 
== {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510183 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510214 6748 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 16:00:00.510079 6748 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1125 16:00:00.510248 6748 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1125 16:00:00.510287 6748 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b
314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.341908 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.358584 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.373145 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.384345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.384392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.384406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.384429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.384445 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.386970 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.402319 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.413914 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.427443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc 
kubenswrapper[4743]: I1125 16:00:01.449472 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.467175 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.480351 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.487297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.487345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.487360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.487379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.487392 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.492855 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f55ff69c-c9b5-40dd-9d4a-57f528cfd71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cbb3a6ce8f0ce1dad29b21a3a2f0c057af3d6fe96c024fbb97b4eaa7df4b4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubele
t\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.506326 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.519167 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.590856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.590920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.590932 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.590954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.590968 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.693629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.693675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.693684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.693700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.693716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.774869 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.774926 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:01 crc kubenswrapper[4743]: E1125 16:00:01.775035 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:01 crc kubenswrapper[4743]: E1125 16:00:01.775165 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.795493 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-2
5T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.796213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.796250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.796263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.796281 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.796293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.808664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.822695 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.842369 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c86ca78d43e70b4374592ebf7a998feb02637e45bb8ad057b66504c66915237a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:31Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1125 15:59:31.500475 6396 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 15:59:31.500500 6396 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 15:59:31.500535 6396 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1125 15:59:31.500570 6396 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:59:31.500623 6396 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:59:31.500631 6396 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:59:31.500649 6396 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 15:59:31.500666 6396 factory.go:656] Stopping watch factory\\\\nI1125 15:59:31.500684 6396 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 15:59:31.500695 6396 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 15:59:31.500702 6396 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 15:59:31.500708 6396 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:59:31.500717 6396 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:59:31.500724 6396 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 15:59:31.500731 6396 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T16:00:00Z\\\",\\\"message\\\":\\\" 16:00:00.510138 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510183 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510214 6748 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 16:00:00.510079 6748 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1125 16:00:00.510248 6748 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1125 16:00:00.510287 6748 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"n
ame\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"en
v-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 
16:00:01.856345 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b369
4e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 
14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.870255 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.884787 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.899180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.899226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.899236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.899259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.899272 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:01Z","lastTransitionTime":"2025-11-25T16:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.909335 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.924206 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.934040 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.947617 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.959194 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f55ff69c-c9b5-40dd-9d4a-57f528cfd71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cbb3a6ce8f0ce1dad29b21a3a2f0c057af3d6fe96c024fbb97b4eaa7df4b4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.974445 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:01 crc kubenswrapper[4743]: I1125 16:00:01.989354 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:01Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.002583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.002668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.002679 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.002700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.002710 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.004123 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.017218 4743 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.036078 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.052928 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2
b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a
5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-2
5T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.066206 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fc
d5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.105498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.105546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.105558 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.105582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.105620 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.208662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.208997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.209008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.209024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.209035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.221049 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/3.log" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.225963 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:00:02 crc kubenswrapper[4743]: E1125 16:00:02.226273 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.241787 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f55ff69c-c9b5-40dd-9d4a-57f528cfd71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cbb3a6ce8f0ce1dad29b21a3a2f0c057af3d6fe96c024fbb97b4eaa7df4b4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.260018 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.279243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.307009 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.311582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.311649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.311659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.311678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.311688 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.323903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.342166 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d632
5667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.366803 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.381740 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fc
d5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.404377 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.414857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.414905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.414918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.414938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.414950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.417343 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.431105 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.452824 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T16:00:00Z\\\",\\\"message\\\":\\\" 16:00:00.510138 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510183 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510214 6748 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 16:00:00.510079 6748 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1125 16:00:00.510248 6748 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1125 16:00:00.510287 6748 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.467072 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.480578 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc 
kubenswrapper[4743]: I1125 16:00:02.493023 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c18
8dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.507990 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.518716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.518775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.518790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.518814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.518828 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.522194 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.534998 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.545370 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:02Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.621415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 
16:00:02.621468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.621484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.621504 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.621517 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.725132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.725229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.725246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.725268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.725295 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.774086 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:02 crc kubenswrapper[4743]: E1125 16:00:02.774291 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.774486 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:02 crc kubenswrapper[4743]: E1125 16:00:02.774844 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.828120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.828175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.828192 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.828215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.828227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.931503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.931570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.931579 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.931627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:02 crc kubenswrapper[4743]: I1125 16:00:02.931640 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:02Z","lastTransitionTime":"2025-11-25T16:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.035165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.035232 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.035253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.035275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.035289 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.138341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.138403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.138418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.138440 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.138455 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.241658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.241720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.241733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.241753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.241765 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.344888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.344943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.344953 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.344972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.344986 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.448099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.448184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.448301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.448346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.448373 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.551736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.551820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.551835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.551858 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.551872 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.654828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.654892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.654903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.654930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.654943 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.758441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.758526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.758546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.758577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.758631 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.774370 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.774429 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:03 crc kubenswrapper[4743]: E1125 16:00:03.774655 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:03 crc kubenswrapper[4743]: E1125 16:00:03.774779 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.861967 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.862080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.862105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.862142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.862167 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.965462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.965520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.965544 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.965563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:03 crc kubenswrapper[4743]: I1125 16:00:03.965576 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:03Z","lastTransitionTime":"2025-11-25T16:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.068891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.069352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.069450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.069559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.069696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.172643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.172980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.173080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.173177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.173267 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.276297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.276338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.276349 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.276384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.276398 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.379713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.379775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.379786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.379846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.379859 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.483123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.483165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.483180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.483209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.483223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.569354 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.569691 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:08.569642925 +0000 UTC m=+147.691482494 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.585842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.585882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.585897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.585918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.585935 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.671423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.671487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.671519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.671567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671771 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671807 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671771 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671825 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671838 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671832 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671897 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.67187374 +0000 UTC m=+147.793713299 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671851 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.671975 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.671934692 +0000 UTC m=+147.793774271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.672008 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.671994374 +0000 UTC m=+147.793833953 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.672064 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.672235 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.672135398 +0000 UTC m=+147.793974977 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.689647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.689700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.689717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.689742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.689760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.774623 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.774822 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.774992 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:04 crc kubenswrapper[4743]: E1125 16:00:04.775238 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.793577 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.793709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.793739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.793770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.793793 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.896570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.896633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.896682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.896719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:04 crc kubenswrapper[4743]: I1125 16:00:04.896737 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:04Z","lastTransitionTime":"2025-11-25T16:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.000199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.000277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.000293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.000348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.000371 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.104162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.104578 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.104762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.104923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.105026 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.207630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.208005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.208112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.208222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.208312 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.312004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.312347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.312436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.312555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.312696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.415677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.416046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.416141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.416213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.416285 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.520120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.520415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.520627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.520755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.520856 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.624934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.624986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.624999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.625021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.625037 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.730142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.730798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.730878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.730961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.731039 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.774393 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:05 crc kubenswrapper[4743]: E1125 16:00:05.774626 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.774890 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:05 crc kubenswrapper[4743]: E1125 16:00:05.775198 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.834487 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.834536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.834548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.834571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.834602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.937873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.937940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.937955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.937977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:05 crc kubenswrapper[4743]: I1125 16:00:05.937991 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:05Z","lastTransitionTime":"2025-11-25T16:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.041040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.041086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.041095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.041112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.041125 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.143665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.143721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.143733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.143751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.143764 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.246446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.246513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.246526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.246553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.246568 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.349179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.349234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.349245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.349263 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.349276 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.453028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.453111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.453136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.453165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.453186 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.556510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.556559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.556571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.556614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.556626 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.659261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.659318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.659330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.659348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.659360 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.762320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.762389 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.762445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.762471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.762486 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.775052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.775118 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:06 crc kubenswrapper[4743]: E1125 16:00:06.775210 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:06 crc kubenswrapper[4743]: E1125 16:00:06.775763 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.865948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.866257 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.866359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.866455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.866534 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.969873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.970204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.970283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.970366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:06 crc kubenswrapper[4743]: I1125 16:00:06.970447 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:06Z","lastTransitionTime":"2025-11-25T16:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.073549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.073630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.073676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.073698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.073709 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.177557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.177617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.177627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.177643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.177654 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.280925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.280971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.280988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.281011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.281029 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.384270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.384316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.384326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.384345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.384379 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.487789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.487839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.487848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.487865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.487879 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.590834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.590883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.590897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.590917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.590931 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.694060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.694155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.694181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.694211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.694234 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.774090 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.774198 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:07 crc kubenswrapper[4743]: E1125 16:00:07.774981 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:07 crc kubenswrapper[4743]: E1125 16:00:07.775212 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.797067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.797163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.797187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.797217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.797237 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.900666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.900716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.900732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.900755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:07 crc kubenswrapper[4743]: I1125 16:00:07.900769 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:07Z","lastTransitionTime":"2025-11-25T16:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.003948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.004023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.004037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.004056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.004069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.107342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.107395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.107408 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.107429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.107444 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.210469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.210528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.210557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.210582 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.210640 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.313445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.313497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.313511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.313532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.313545 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.416643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.416684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.416695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.416713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.416727 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.519650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.519703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.519714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.519730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.519742 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.622222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.622259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.622267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.622282 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.622293 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.724546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.724613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.724627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.724647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.724662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.774452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.774584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:08 crc kubenswrapper[4743]: E1125 16:00:08.774788 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:08 crc kubenswrapper[4743]: E1125 16:00:08.774988 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.827968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.828011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.828023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.828041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.828053 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.930838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.930910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.930928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.930954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.930971 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.992767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.992809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.992818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.992832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:08 crc kubenswrapper[4743]: I1125 16:00:08.992844 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:08Z","lastTransitionTime":"2025-11-25T16:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.004300 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:09Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.008194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.008233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.008243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.008259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.008269 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.019385 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:09Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.022691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.022725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.022738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.022760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.022772 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.035451 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:09Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.039223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.039255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.039265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.039280 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.039291 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.051114 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:09Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.055170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.055215 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.055225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.055240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.055251 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.067871 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T16:00:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c508e6b4-2850-452f-b81a-6a39b638eedc\\\",\\\"systemUUID\\\":\\\"b6bba882-710f-4262-b836-cf27dad9acbb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:09Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.067988 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.069699 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.069753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.069766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.069788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.069800 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.172720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.172764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.172777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.172798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.172810 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.275425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.275469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.275479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.275498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.275511 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.378516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.378571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.378583 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.378625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.378640 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.481106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.481543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.481555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.481633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.481653 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.584943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.584991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.585001 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.585020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.585036 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.688183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.688238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.688247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.688269 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.688282 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.774952 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.775156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.775453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:09 crc kubenswrapper[4743]: E1125 16:00:09.775657 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.791004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.791073 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.791088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.791109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.791126 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.893606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.893663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.893676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.893700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.893716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.997301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.997377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.997393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.997418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:09 crc kubenswrapper[4743]: I1125 16:00:09.997433 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:09Z","lastTransitionTime":"2025-11-25T16:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.100324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.100381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.100396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.100419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.100433 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.203633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.203716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.203732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.203755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.203771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.307233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.307329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.307343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.307362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.307375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.410230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.410288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.410300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.410320 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.410336 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.513128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.513211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.513226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.513272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.513291 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.616017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.616088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.616104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.616145 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.616160 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.719511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.719560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.719571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.719609 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.719626 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.774321 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.774382 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:10 crc kubenswrapper[4743]: E1125 16:00:10.774490 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:10 crc kubenswrapper[4743]: E1125 16:00:10.774579 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.822886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.823128 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.823141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.823167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.823185 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.925916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.925979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.925993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.926017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:10 crc kubenswrapper[4743]: I1125 16:00:10.926031 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:10Z","lastTransitionTime":"2025-11-25T16:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.029152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.029211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.029219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.029234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.029245 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.131971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.132010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.132022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.132038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.132051 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.235905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.235969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.235982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.236004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.236021 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.339404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.339475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.339498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.339526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.339547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.442501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.442540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.442549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.442567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.442578 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.546015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.546055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.546064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.546079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.546090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.649575 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.649649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.649662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.649683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.649699 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.753379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.753448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.753472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.753505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.753536 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.775042 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.775128 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:11 crc kubenswrapper[4743]: E1125 16:00:11.775325 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:11 crc kubenswrapper[4743]: E1125 16:00:11.775483 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.790999 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdedf2ea-d41d-4ac8-b52e-9544349b5397\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dfe12d7776235d1b0963eec372670fd06f5841d3043d19f67fbb451c094fb48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://026080959c785da824e070de77da0c26825b171b32d9504a31db69aaa55090a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4676e38dd64958dad5a059f86876bd0926a05c5929fbf567bc2eedd2f42c9a12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.801354 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6xggw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78bac9e6-44dd-4270-9183-774823a568a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ede0bef771bcbc2143c6dc2e98ce7954cdade63b4ce76d5bf21e6735c85979\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htr6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6xggw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.814152 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n2r2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2175b34c-5202-4e94-af0e-2f879b98c0bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:59:52Z\\\",\\\"message\\\":\\\"2025-11-25T15:59:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159\\\\n2025-11-25T15:59:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_732b0472-8785-498f-ae94-a1cd81379159 to /host/opt/cni/bin/\\\\n2025-11-25T15:59:07Z [verbose] multus-daemon started\\\\n2025-11-25T15:59:07Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:59:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-858zh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n2r2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.828031 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2d6248c-be7e-48f3-b314-6089c361b67a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9a85f73bdc6425ed7860166190b2d94530e765a2987938a4347b9861113359b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeb82b9700706b19a5c8bde0de5c1809e5c0c97e2c595c59bb3d5fdcbf4868ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d74d79bc2dd8101dc2b0990d437b9f5e3b6ed4d52b357893f2ffbdc12ae9200d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4684115393b94ed37a5d49fa344f9f209065e2e126cf516c617cb8f87aa565f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd7e0
92257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd7e092257bf85f28e62902ce3b23a57eaa71337065404c0ee6080045ee2e77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbc197a23643191cdc9b8c441d2ac90a268cc37a7c71b9c16136bcb530d6715b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://789c6332d831bff0aefea0fe05e0611f29870e277fac6873bdd0e14ae23c78e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqwtx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntxtl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.838467 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8169051d-4e2b-48e2-96e1-c113cadf2d76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7fafb22f65383a42e4e0fbfb761a25f5b9e2756a0efa6018694929b1d89bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://912dc079e1d775b0cf0f4f79bf0c205a810fcd5902bbc569a7a50efeb8f9d9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjzf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fhb8n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.855843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.855876 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.855886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.855905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.855917 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.857575 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0834a9-e7c9-4a76-a769-d4d3509d279e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6650d9a20826b192919130d0b499f08bdbc37c51055e79ea9a2dde9f7fa45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7215d48912ff899ea742d5a6518665cf5a6cd85e83cd6365080ed7fbdd68965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaaef4aaf2cc23bfeee032ccf975016717717b1bc78fac3088d236f25056318e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fea61df5f63065475dcdc2cd7bc5bf944d278e2edd4622db48ee042a70d4610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7eb7be9fde699c0491f392f1b90068c04fd7091872cd8fa0b78c879ca757a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba7b1de14258c4c40fb9945a5b62572b0885723f9544a48b99f21e2dd6630af1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://999df38149724ed6a5b79ebd4b454784acd011793129a5b036a63dda63a0c802\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c143b64790a0eee7690bbf0ea29a7d15be96520d9b440106cea579f5e6dd1b98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.869113 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.882253 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4abbca58ee143cb2e47b6d2755dc25cc367be8101e36fad66cc7d3d18abdbac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c5d5e1f166d93d54984d1536bc8fd97e5671101458257eb5f5326b205309b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.905291 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d04400c3-4f05-4be2-b759-a60cec0746ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T16:00:00Z\\\",\\\"message\\\":\\\" 16:00:00.510138 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510183 6748 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 16:00:00.510214 6748 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 16:00:00.510079 6748 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1125 16:00:00.510248 6748 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1125 16:00:00.510287 6748 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40edd80407d3a46c03
aaf0a8b7c1d852501473b784942d71014b02256fe6b314\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbbps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbbjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.918652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7ccb4d3-6e49-456e-82c1-923ce6c11d69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:59:00Z\\\"
,\\\"message\\\":\\\"kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1125 15:59:00.337965 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1125 15:59:00.337984 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1125 15:59:00.338044 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1125 15:59:00.338053 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764086324\\\\\\\\\\\\\\\" (2025-11-25 15:58:44 +0000 UTC to 2025-12-25 15:58:45 +0000 UTC (now=2025-11-25 15:59:00.338017609 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338090 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4041511136/tls.crt::/tmp/serving-cert-4041511136/tls.key\\\\\\\"\\\\nI1125 15:59:00.338178 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764086325\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764086325\\\\\\\\\\\\\\\" (2025-11-25 14:58:45 +0000 UTC to 2026-11-25 14:58:45 +0000 UTC (now=2025-11-25 15:59:00.338159664 +0000 UTC))\\\\\\\"\\\\nI1125 15:59:00.338200 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1125 15:59:00.338223 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1125 15:59:00.338064 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1125 15:59:00.338423 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.932940 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7aa192fba5a24e2411cd0ff69e3b0d3690fc9e03e1f9d3b10eaad57801a7fdce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.946773 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959304 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:11Z","lastTransitionTime":"2025-11-25T16:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.959796 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7def686dadd541aa1b38ddfa4409f8438daf5115adfcf4b866fc99f8efcf6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.969831 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73c29847-f70f-4ab1-9691-685966384446\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915bfccb9bdd882a8dcb6dc75a7acb3cddff54b512d9818a353fd7de3c22673b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pzb4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f7q7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.979923 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zxxwm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e69c3c02-668d-42ba-9347-e5bea6cdf260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace38022251952c6c9f8dc1947e1263a6450073a3ae0f59b48d428abb4f7a37f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:59:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x5jz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zxxwm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.989024 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s9t79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"617512f9-f767-4615-a9d2-132c6c73a69d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd9cz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:59:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s9t79\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:11 crc kubenswrapper[4743]: I1125 16:00:11.999181 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f55ff69c-c9b5-40dd-9d4a-57f528cfd71f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cbb3a6ce8f0ce1dad29b21a3a2f0c057af3d6fe96c024fbb97b4eaa7df4b4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95c41a1e21cfb871bbbfef7b006ede93c0375bd44e2b84ba274b0a5f9a167abe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:11Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.010356 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6629b5da-56a6-4b21-bed8-c6f5ee333837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dbe4f4383f97c10cebc1610dcb8cfada03fef270e471640d8efdfadeed821c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a128eb053d966105d31ed7d9af2d2306eb60abd8f0aa8924e45156535d5a7ada\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136e156bb873571a110dbc5090cfcbd1c0eeec8af8f39bb8aea7f0bb369b6389\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://27985ba82388d79ac95165ec776299662fd93080742f7b6aba70cd1f462b71e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:12Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.022199 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T16:00:12Z is after 2025-08-24T17:21:41Z" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.062144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.062200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.062215 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.062235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.062248 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.165616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.165673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.165689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.165714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.165729 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.268998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.269057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.269070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.269091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.269104 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.372262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.372319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.372332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.372349 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.372363 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.474791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.474873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.474884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.474903 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.474916 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.578484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.578560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.578584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.578663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.578688 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.682132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.682206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.682225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.682256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.682275 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.775028 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.775277 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:12 crc kubenswrapper[4743]: E1125 16:00:12.775451 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:12 crc kubenswrapper[4743]: E1125 16:00:12.775627 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.785663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.785708 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.785724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.785746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.785763 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.889355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.889419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.889438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.889466 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.889484 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.992421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.992460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.992473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.992499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:12 crc kubenswrapper[4743]: I1125 16:00:12.992516 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:12Z","lastTransitionTime":"2025-11-25T16:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.095294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.095351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.095361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.095380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.095391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.198360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.198414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.198425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.198443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.198456 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.301329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.301387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.301404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.301423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.301436 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.403710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.403773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.403787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.403811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.403825 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.506822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.506888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.506905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.506930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.506947 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.616468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.616540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.616554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.616613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.616629 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.720636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.720693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.720707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.720730 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.720743 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.774796 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:13 crc kubenswrapper[4743]: E1125 16:00:13.774977 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.775092 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:13 crc kubenswrapper[4743]: E1125 16:00:13.775352 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.823610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.823670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.823681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.823701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.823716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.926918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.926979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.926990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.927007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:13 crc kubenswrapper[4743]: I1125 16:00:13.927021 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:13Z","lastTransitionTime":"2025-11-25T16:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.029968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.030027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.030046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.030071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.030085 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.132652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.132686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.132695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.132709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.132719 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.236174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.236220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.236236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.236259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.236273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.339032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.339074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.339105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.339135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.339148 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.442041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.442337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.442455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.442542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.442694 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.545835 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.545889 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.545907 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.545932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.545950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.649127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.649210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.649238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.649277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.649300 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.752095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.752396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.752479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.752701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.752813 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.774641 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.774713 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:14 crc kubenswrapper[4743]: E1125 16:00:14.774825 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:14 crc kubenswrapper[4743]: E1125 16:00:14.774950 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.856075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.856450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.856550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.856682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.856789 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.959785 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.959822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.959832 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.959848 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:14 crc kubenswrapper[4743]: I1125 16:00:14.959859 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:14Z","lastTransitionTime":"2025-11-25T16:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.062665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.062707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.062751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.062773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.062784 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.165833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.165894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.165910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.165934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.165948 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.268363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.268409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.268420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.268438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.268448 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.372004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.372109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.372133 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.372164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.372188 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.475502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.475568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.475605 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.475625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.475636 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.578175 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.578225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.578239 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.578258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.578273 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.681150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.681196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.681207 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.681226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.681240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.774728 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.774781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:15 crc kubenswrapper[4743]: E1125 16:00:15.774919 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:15 crc kubenswrapper[4743]: E1125 16:00:15.775531 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.775840 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:00:15 crc kubenswrapper[4743]: E1125 16:00:15.776088 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.783507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.783558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.783573 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.783608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.783622 4743 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.886454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.886497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.886515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.886540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.886555 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.989865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.989920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.989933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.989955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:15 crc kubenswrapper[4743]: I1125 16:00:15.989968 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:15Z","lastTransitionTime":"2025-11-25T16:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.095267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.095524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.095538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.095851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.095912 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.198881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.199148 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.199236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.199329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.199402 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.302344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.302390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.302401 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.302417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.302428 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.404996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.405045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.405057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.405076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.405090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.507285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.507327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.507336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.507351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.507362 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.610718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.610760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.610770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.610786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.610796 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.713146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.713185 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.713193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.713209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.713219 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.774716 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.774834 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:16 crc kubenswrapper[4743]: E1125 16:00:16.774893 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:16 crc kubenswrapper[4743]: E1125 16:00:16.775035 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.816091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.816132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.816147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.816171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.816184 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.918745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.918789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.918798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.918814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:16 crc kubenswrapper[4743]: I1125 16:00:16.918828 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:16Z","lastTransitionTime":"2025-11-25T16:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.021515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.021560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.021570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.021606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.021621 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.123673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.123710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.123719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.123735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.123745 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.226883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.226935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.226950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.226979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.226997 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.330114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.330147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.330159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.330179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.330194 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.433295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.433346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.433354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.433372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.433382 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.535886 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.535925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.535934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.535951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.535961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.639018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.639083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.639097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.639116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.639128 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.742343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.742396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.742406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.742424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.742439 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.774557 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.774667 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:17 crc kubenswrapper[4743]: E1125 16:00:17.774702 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:17 crc kubenswrapper[4743]: E1125 16:00:17.774845 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.845202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.845273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.845286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.845310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.845329 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.948522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.948642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.948658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.948682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:17 crc kubenswrapper[4743]: I1125 16:00:17.948698 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:17Z","lastTransitionTime":"2025-11-25T16:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.051741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.051790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.051806 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.051829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.051848 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.154329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.154372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.154383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.154403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.154416 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.257286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.257338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.257350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.257372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.257384 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.360262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.360315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.360326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.360347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.360365 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.463305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.463364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.463403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.463428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.463443 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.566670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.566722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.566734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.566758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.566770 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.673727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.673882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.673896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.673920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.673940 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.774499 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.774541 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:18 crc kubenswrapper[4743]: E1125 16:00:18.774703 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:18 crc kubenswrapper[4743]: E1125 16:00:18.774790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.777786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.777828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.777841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.777860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.777876 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.880911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.880976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.880989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.881011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.881026 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.984332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.984402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.984419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.984646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:18 crc kubenswrapper[4743]: I1125 16:00:18.984672 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:18Z","lastTransitionTime":"2025-11-25T16:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.086805 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.086860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.086872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.086932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.086957 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:19Z","lastTransitionTime":"2025-11-25T16:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.189425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.189503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.189517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.189541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.189557 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:19Z","lastTransitionTime":"2025-11-25T16:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.199482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.199536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.199551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.199570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.199582 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T16:00:19Z","lastTransitionTime":"2025-11-25T16:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.244518 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j"] Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.245093 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.247322 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.247921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.247947 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.248159 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.261985 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.261961093 podStartE2EDuration="51.261961093s" podCreationTimestamp="2025-11-25 15:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.261051444 +0000 UTC m=+98.382891013" watchObservedRunningTime="2025-11-25 16:00:19.261961093 +0000 UTC m=+98.383800632" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.297959 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6xggw" podStartSLOduration=75.297933855 podStartE2EDuration="1m15.297933855s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.297935246 +0000 UTC m=+98.419774815" watchObservedRunningTime="2025-11-25 16:00:19.297933855 +0000 UTC m=+98.419773404" 
Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.298205 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.298199374 podStartE2EDuration="19.298199374s" podCreationTimestamp="2025-11-25 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.286562198 +0000 UTC m=+98.408401747" watchObservedRunningTime="2025-11-25 16:00:19.298199374 +0000 UTC m=+98.420038923" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.331620 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n2r2l" podStartSLOduration=75.331557252 podStartE2EDuration="1m15.331557252s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.311547186 +0000 UTC m=+98.433386745" watchObservedRunningTime="2025-11-25 16:00:19.331557252 +0000 UTC m=+98.453396801" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.342573 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.342637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/697b6319-89fd-481b-aba5-a22cd031ee1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.342661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.342679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/697b6319-89fd-481b-aba5-a22cd031ee1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.342872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697b6319-89fd-481b-aba5-a22cd031ee1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.346368 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ntxtl" podStartSLOduration=75.346352191 podStartE2EDuration="1m15.346352191s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.331900724 +0000 UTC m=+98.453740323" watchObservedRunningTime="2025-11-25 16:00:19.346352191 +0000 UTC 
m=+98.468191740" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.346542 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fhb8n" podStartSLOduration=75.346538037 podStartE2EDuration="1m15.346538037s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.346447034 +0000 UTC m=+98.468286583" watchObservedRunningTime="2025-11-25 16:00:19.346538037 +0000 UTC m=+98.468377576" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.362420 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.362390299 podStartE2EDuration="1m14.362390299s" podCreationTimestamp="2025-11-25 15:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.362001666 +0000 UTC m=+98.483841235" watchObservedRunningTime="2025-11-25 16:00:19.362390299 +0000 UTC m=+98.484229848" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444326 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: 
\"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/697b6319-89fd-481b-aba5-a22cd031ee1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/697b6319-89fd-481b-aba5-a22cd031ee1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697b6319-89fd-481b-aba5-a22cd031ee1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.444725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/697b6319-89fd-481b-aba5-a22cd031ee1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.445770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/697b6319-89fd-481b-aba5-a22cd031ee1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.451888 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/697b6319-89fd-481b-aba5-a22cd031ee1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.458728 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=79.458697802 podStartE2EDuration="1m19.458697802s" podCreationTimestamp="2025-11-25 15:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.458660341 +0000 UTC m=+98.580499910" watchObservedRunningTime="2025-11-25 16:00:19.458697802 +0000 UTC m=+98.580537371" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.464473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/697b6319-89fd-481b-aba5-a22cd031ee1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z464j\" (UID: \"697b6319-89fd-481b-aba5-a22cd031ee1c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.532215 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podStartSLOduration=75.532191828 podStartE2EDuration="1m15.532191828s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.519019562 +0000 UTC m=+98.640859121" watchObservedRunningTime="2025-11-25 16:00:19.532191828 +0000 UTC m=+98.654031377" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.532654 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zxxwm" podStartSLOduration=75.532649143 podStartE2EDuration="1m15.532649143s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.531802196 +0000 UTC m=+98.653641765" watchObservedRunningTime="2025-11-25 16:00:19.532649143 +0000 UTC m=+98.654488692" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.560449 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.564071 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.564043737 podStartE2EDuration="1m19.564043737s" podCreationTimestamp="2025-11-25 15:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:19.562969163 +0000 UTC m=+98.684808732" watchObservedRunningTime="2025-11-25 16:00:19.564043737 +0000 UTC m=+98.685883286" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.774642 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:19 crc kubenswrapper[4743]: I1125 16:00:19.774836 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:19 crc kubenswrapper[4743]: E1125 16:00:19.775305 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:19 crc kubenswrapper[4743]: E1125 16:00:19.775470 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:20 crc kubenswrapper[4743]: I1125 16:00:20.288856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" event={"ID":"697b6319-89fd-481b-aba5-a22cd031ee1c","Type":"ContainerStarted","Data":"e034fcef6ecf46e469d71cfae17fc9739e136a6d142f02c5df3accc0335062aa"} Nov 25 16:00:20 crc kubenswrapper[4743]: I1125 16:00:20.288920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" event={"ID":"697b6319-89fd-481b-aba5-a22cd031ee1c","Type":"ContainerStarted","Data":"12f7bd423bd7df584d49e7fd754feb398f233b411060a9f2dc12eac7b2db4000"} Nov 25 16:00:20 crc kubenswrapper[4743]: I1125 16:00:20.774324 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:20 crc kubenswrapper[4743]: I1125 16:00:20.774500 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:20 crc kubenswrapper[4743]: E1125 16:00:20.774652 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:20 crc kubenswrapper[4743]: E1125 16:00:20.774839 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:21 crc kubenswrapper[4743]: I1125 16:00:21.774555 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:21 crc kubenswrapper[4743]: I1125 16:00:21.775304 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:21 crc kubenswrapper[4743]: E1125 16:00:21.775419 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:21 crc kubenswrapper[4743]: E1125 16:00:21.775615 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:22 crc kubenswrapper[4743]: I1125 16:00:22.774167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:22 crc kubenswrapper[4743]: I1125 16:00:22.774248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:22 crc kubenswrapper[4743]: E1125 16:00:22.774542 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:22 crc kubenswrapper[4743]: E1125 16:00:22.774723 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:22 crc kubenswrapper[4743]: I1125 16:00:22.786126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:22 crc kubenswrapper[4743]: E1125 16:00:22.786385 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 16:00:22 crc kubenswrapper[4743]: E1125 16:00:22.786517 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs podName:617512f9-f767-4615-a9d2-132c6c73a69d nodeName:}" failed. No retries permitted until 2025-11-25 16:01:26.786480099 +0000 UTC m=+165.908319828 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs") pod "network-metrics-daemon-s9t79" (UID: "617512f9-f767-4615-a9d2-132c6c73a69d") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 16:00:23 crc kubenswrapper[4743]: I1125 16:00:23.774355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:23 crc kubenswrapper[4743]: I1125 16:00:23.774417 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:23 crc kubenswrapper[4743]: E1125 16:00:23.774535 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:23 crc kubenswrapper[4743]: E1125 16:00:23.774890 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:24 crc kubenswrapper[4743]: I1125 16:00:24.774360 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:24 crc kubenswrapper[4743]: E1125 16:00:24.774502 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:24 crc kubenswrapper[4743]: I1125 16:00:24.774503 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:24 crc kubenswrapper[4743]: E1125 16:00:24.774743 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:25 crc kubenswrapper[4743]: I1125 16:00:25.774548 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:25 crc kubenswrapper[4743]: I1125 16:00:25.774640 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:25 crc kubenswrapper[4743]: E1125 16:00:25.774797 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:25 crc kubenswrapper[4743]: E1125 16:00:25.775182 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:26 crc kubenswrapper[4743]: I1125 16:00:26.774900 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:26 crc kubenswrapper[4743]: E1125 16:00:26.775542 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:26 crc kubenswrapper[4743]: I1125 16:00:26.774991 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:26 crc kubenswrapper[4743]: E1125 16:00:26.775841 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:26 crc kubenswrapper[4743]: I1125 16:00:26.775728 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:00:26 crc kubenswrapper[4743]: E1125 16:00:26.776384 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 16:00:27 crc kubenswrapper[4743]: I1125 16:00:27.774249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:27 crc kubenswrapper[4743]: I1125 16:00:27.774245 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:27 crc kubenswrapper[4743]: E1125 16:00:27.774492 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:27 crc kubenswrapper[4743]: E1125 16:00:27.774691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:28 crc kubenswrapper[4743]: I1125 16:00:28.774195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:28 crc kubenswrapper[4743]: I1125 16:00:28.774294 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:28 crc kubenswrapper[4743]: E1125 16:00:28.774362 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:28 crc kubenswrapper[4743]: E1125 16:00:28.774494 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:29 crc kubenswrapper[4743]: I1125 16:00:29.774782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:29 crc kubenswrapper[4743]: I1125 16:00:29.774782 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:29 crc kubenswrapper[4743]: E1125 16:00:29.774941 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:29 crc kubenswrapper[4743]: E1125 16:00:29.774996 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:30 crc kubenswrapper[4743]: I1125 16:00:30.773968 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:30 crc kubenswrapper[4743]: E1125 16:00:30.774139 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:30 crc kubenswrapper[4743]: I1125 16:00:30.773970 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:30 crc kubenswrapper[4743]: E1125 16:00:30.774291 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:31 crc kubenswrapper[4743]: I1125 16:00:31.774940 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:31 crc kubenswrapper[4743]: I1125 16:00:31.775060 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:31 crc kubenswrapper[4743]: E1125 16:00:31.776549 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:31 crc kubenswrapper[4743]: E1125 16:00:31.776733 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:32 crc kubenswrapper[4743]: I1125 16:00:32.774718 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:32 crc kubenswrapper[4743]: I1125 16:00:32.774818 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:32 crc kubenswrapper[4743]: E1125 16:00:32.774917 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:32 crc kubenswrapper[4743]: E1125 16:00:32.775097 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:33 crc kubenswrapper[4743]: I1125 16:00:33.774173 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:33 crc kubenswrapper[4743]: I1125 16:00:33.774191 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:33 crc kubenswrapper[4743]: E1125 16:00:33.774508 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:33 crc kubenswrapper[4743]: E1125 16:00:33.774666 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:34 crc kubenswrapper[4743]: I1125 16:00:34.774763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:34 crc kubenswrapper[4743]: I1125 16:00:34.774936 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:34 crc kubenswrapper[4743]: E1125 16:00:34.774973 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:34 crc kubenswrapper[4743]: E1125 16:00:34.775266 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:35 crc kubenswrapper[4743]: I1125 16:00:35.774416 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:35 crc kubenswrapper[4743]: I1125 16:00:35.774462 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:35 crc kubenswrapper[4743]: E1125 16:00:35.774560 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:35 crc kubenswrapper[4743]: E1125 16:00:35.774647 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:36 crc kubenswrapper[4743]: I1125 16:00:36.774531 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:36 crc kubenswrapper[4743]: I1125 16:00:36.774631 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:36 crc kubenswrapper[4743]: E1125 16:00:36.774793 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:36 crc kubenswrapper[4743]: E1125 16:00:36.775012 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:37 crc kubenswrapper[4743]: I1125 16:00:37.774349 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:37 crc kubenswrapper[4743]: E1125 16:00:37.775233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:37 crc kubenswrapper[4743]: I1125 16:00:37.774349 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:37 crc kubenswrapper[4743]: E1125 16:00:37.775582 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:38 crc kubenswrapper[4743]: I1125 16:00:38.774743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:38 crc kubenswrapper[4743]: I1125 16:00:38.774863 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:38 crc kubenswrapper[4743]: E1125 16:00:38.774932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:38 crc kubenswrapper[4743]: E1125 16:00:38.775178 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.358351 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/1.log" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.358937 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/0.log" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.359056 4743 generic.go:334] "Generic (PLEG): container finished" podID="2175b34c-5202-4e94-af0e-2f879b98c0bc" containerID="47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19" exitCode=1 Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.359138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerDied","Data":"47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19"} Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.359197 4743 scope.go:117] "RemoveContainer" containerID="1dfcd66ee1cb77e6f81aebf32ca8a91893a2206c634458590e2a1851d886f94e" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.360057 4743 scope.go:117] "RemoveContainer" containerID="47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19" Nov 25 16:00:39 crc kubenswrapper[4743]: E1125 16:00:39.360354 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n2r2l_openshift-multus(2175b34c-5202-4e94-af0e-2f879b98c0bc)\"" pod="openshift-multus/multus-n2r2l" podUID="2175b34c-5202-4e94-af0e-2f879b98c0bc" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.378391 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z464j" podStartSLOduration=95.378367099 podStartE2EDuration="1m35.378367099s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:20.302007882 +0000 UTC m=+99.423847441" watchObservedRunningTime="2025-11-25 16:00:39.378367099 +0000 UTC m=+118.500206648" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.774540 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.774634 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:39 crc kubenswrapper[4743]: E1125 16:00:39.774833 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:39 crc kubenswrapper[4743]: E1125 16:00:39.775250 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:39 crc kubenswrapper[4743]: I1125 16:00:39.775670 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:00:39 crc kubenswrapper[4743]: E1125 16:00:39.775859 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbbjc_openshift-ovn-kubernetes(d04400c3-4f05-4be2-b759-a60cec0746ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" Nov 25 16:00:40 crc kubenswrapper[4743]: I1125 16:00:40.365080 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/1.log" Nov 25 16:00:40 crc kubenswrapper[4743]: I1125 16:00:40.773981 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:40 crc kubenswrapper[4743]: I1125 16:00:40.774080 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:40 crc kubenswrapper[4743]: E1125 16:00:40.774286 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:40 crc kubenswrapper[4743]: E1125 16:00:40.774397 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:41 crc kubenswrapper[4743]: I1125 16:00:41.774182 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:41 crc kubenswrapper[4743]: I1125 16:00:41.774217 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:41 crc kubenswrapper[4743]: E1125 16:00:41.775312 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:41 crc kubenswrapper[4743]: E1125 16:00:41.775560 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:41 crc kubenswrapper[4743]: E1125 16:00:41.809219 4743 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 25 16:00:41 crc kubenswrapper[4743]: E1125 16:00:41.900926 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 16:00:42 crc kubenswrapper[4743]: I1125 16:00:42.774793 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:42 crc kubenswrapper[4743]: I1125 16:00:42.774825 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:42 crc kubenswrapper[4743]: E1125 16:00:42.774958 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:42 crc kubenswrapper[4743]: E1125 16:00:42.775085 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:43 crc kubenswrapper[4743]: I1125 16:00:43.774144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:43 crc kubenswrapper[4743]: E1125 16:00:43.774290 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:43 crc kubenswrapper[4743]: I1125 16:00:43.774173 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:43 crc kubenswrapper[4743]: E1125 16:00:43.774402 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:44 crc kubenswrapper[4743]: I1125 16:00:44.774784 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:44 crc kubenswrapper[4743]: I1125 16:00:44.774847 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:44 crc kubenswrapper[4743]: E1125 16:00:44.775079 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 16:00:44 crc kubenswrapper[4743]: E1125 16:00:44.775217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d" Nov 25 16:00:45 crc kubenswrapper[4743]: I1125 16:00:45.774386 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:00:45 crc kubenswrapper[4743]: I1125 16:00:45.774398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:00:45 crc kubenswrapper[4743]: E1125 16:00:45.774570 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 16:00:45 crc kubenswrapper[4743]: E1125 16:00:45.774655 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 16:00:46 crc kubenswrapper[4743]: I1125 16:00:46.774307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:00:46 crc kubenswrapper[4743]: I1125 16:00:46.774439 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:00:46 crc kubenswrapper[4743]: E1125 16:00:46.774775 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:46 crc kubenswrapper[4743]: E1125 16:00:46.774903 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:46 crc kubenswrapper[4743]: E1125 16:00:46.902146 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 25 16:00:47 crc kubenswrapper[4743]: I1125 16:00:47.773956 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:47 crc kubenswrapper[4743]: I1125 16:00:47.773956 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:47 crc kubenswrapper[4743]: E1125 16:00:47.774095 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 16:00:47 crc kubenswrapper[4743]: E1125 16:00:47.774178 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 16:00:48 crc kubenswrapper[4743]: I1125 16:00:48.773843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:48 crc kubenswrapper[4743]: I1125 16:00:48.773957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:48 crc kubenswrapper[4743]: E1125 16:00:48.774018 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:48 crc kubenswrapper[4743]: E1125 16:00:48.774183 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:49 crc kubenswrapper[4743]: I1125 16:00:49.774038 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:49 crc kubenswrapper[4743]: I1125 16:00:49.774172 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:49 crc kubenswrapper[4743]: E1125 16:00:49.774321 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 16:00:49 crc kubenswrapper[4743]: E1125 16:00:49.774407 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 16:00:50 crc kubenswrapper[4743]: I1125 16:00:50.774838 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:50 crc kubenswrapper[4743]: I1125 16:00:50.774896 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:50 crc kubenswrapper[4743]: E1125 16:00:50.775047 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:50 crc kubenswrapper[4743]: E1125 16:00:50.775356 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:50 crc kubenswrapper[4743]: I1125 16:00:50.775501 4743 scope.go:117] "RemoveContainer" containerID="47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19"
Nov 25 16:00:51 crc kubenswrapper[4743]: I1125 16:00:51.774333 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:51 crc kubenswrapper[4743]: I1125 16:00:51.774461 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:51 crc kubenswrapper[4743]: E1125 16:00:51.775929 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 16:00:51 crc kubenswrapper[4743]: E1125 16:00:51.776125 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 16:00:51 crc kubenswrapper[4743]: I1125 16:00:51.776178 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3"
Nov 25 16:00:51 crc kubenswrapper[4743]: E1125 16:00:51.903088 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.414575 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/1.log"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.414749 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerStarted","Data":"3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900"}
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.418094 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/3.log"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.421699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerStarted","Data":"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"}
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.422180 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.489770 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podStartSLOduration=108.489751609 podStartE2EDuration="1m48.489751609s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:52.489090651 +0000 UTC m=+131.610930220" watchObservedRunningTime="2025-11-25 16:00:52.489751609 +0000 UTC m=+131.611591158"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.774049 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.774120 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:52 crc kubenswrapper[4743]: E1125 16:00:52.774875 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:52 crc kubenswrapper[4743]: E1125 16:00:52.775165 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:52 crc kubenswrapper[4743]: I1125 16:00:52.896203 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s9t79"]
Nov 25 16:00:53 crc kubenswrapper[4743]: I1125 16:00:53.425823 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:53 crc kubenswrapper[4743]: E1125 16:00:53.426493 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:53 crc kubenswrapper[4743]: I1125 16:00:53.774982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:53 crc kubenswrapper[4743]: I1125 16:00:53.775078 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:53 crc kubenswrapper[4743]: E1125 16:00:53.775147 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 16:00:53 crc kubenswrapper[4743]: E1125 16:00:53.775207 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 16:00:54 crc kubenswrapper[4743]: I1125 16:00:54.774686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:54 crc kubenswrapper[4743]: I1125 16:00:54.774724 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:54 crc kubenswrapper[4743]: E1125 16:00:54.774853 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:54 crc kubenswrapper[4743]: E1125 16:00:54.774992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:55 crc kubenswrapper[4743]: I1125 16:00:55.774882 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:55 crc kubenswrapper[4743]: I1125 16:00:55.774882 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:55 crc kubenswrapper[4743]: E1125 16:00:55.775280 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 16:00:55 crc kubenswrapper[4743]: E1125 16:00:55.775352 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 16:00:56 crc kubenswrapper[4743]: I1125 16:00:56.774507 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:56 crc kubenswrapper[4743]: I1125 16:00:56.774632 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:56 crc kubenswrapper[4743]: E1125 16:00:56.774657 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 16:00:56 crc kubenswrapper[4743]: E1125 16:00:56.774833 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s9t79" podUID="617512f9-f767-4615-a9d2-132c6c73a69d"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.774025 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.774093 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.775998 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.776065 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.777215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Nov 25 16:00:57 crc kubenswrapper[4743]: I1125 16:00:57.778168 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 25 16:00:58 crc kubenswrapper[4743]: I1125 16:00:58.774740 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79"
Nov 25 16:00:58 crc kubenswrapper[4743]: I1125 16:00:58.775478 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 16:00:58 crc kubenswrapper[4743]: I1125 16:00:58.777474 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Nov 25 16:00:58 crc kubenswrapper[4743]: I1125 16:00:58.777766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.011653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.052149 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.052616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.053139 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gwtwj"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.053440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.054302 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjh85"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.054576 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zjh85"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.054977 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.055231 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.055928 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w8pxz"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.056372 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.056505 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j9hhf"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.057053 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.063716 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.064240 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.071507 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.073418 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.075436 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.075725 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.076202 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.077218 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.077385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.077505 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.077713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.078129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.089241 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.089365 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.089498 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.089732 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.089875 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.090436 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.090606 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.090729 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.090787 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.090972 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091082 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091129 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091224 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091320 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091396 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091525 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.091697 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096151 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096235 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096171 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096361 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096380 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096454 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096570 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096742 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096814 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096853 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.096964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.097087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.097105 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.097236 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.099520 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.099897 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.100426 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-55g75"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.100731 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-55g75"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.101209 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.101576 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.101782 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.101882 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.101964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.102856 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.105339 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.105889 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.106045 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.106081 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.106908 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.109794 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.110163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.110400 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.110612 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.110737 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.111212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.112161 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.112585 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.114382 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.114442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.114385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.117126 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.117504 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4sghb"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.117890 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4sghb"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.117893 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.117992 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.124663 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.124784 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.125563 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.128269 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.129334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.129518 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.130038 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.134616 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.134977 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.135132 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.135255 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.135477 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tjsl5"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.141795 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.142611 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.143847 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-td7r9"]
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.144314 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.147946 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.149495 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.150000 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.150153 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.150360 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.150535 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.151028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxd8\" (UniqueName: \"kubernetes.io/projected/766093a7-4c21-4cd4-b8b1-074140a620c9-kube-api-access-2jxd8\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.151167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766093a7-4c21-4cd4-b8b1-074140a620c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.151249 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766093a7-4c21-4cd4-b8b1-074140a620c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.153042 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.153356 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.153491 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.153581 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.154240 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.160133 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.160892 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.161492 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krv4b"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162163 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162278 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162333 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162553 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162693 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162760 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162894 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162979 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163700 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163856 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163902 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163994 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.162553 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.164128 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.163712 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.164571 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"] Nov 25 16:01:00 crc 
kubenswrapper[4743]: I1125 16:01:00.164664 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.164778 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.164910 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.164991 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.165087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.165158 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.165721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.167044 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.167642 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.169104 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.171239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.176353 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.176644 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-84rbl"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.177324 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ldzq4"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.177694 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.177973 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.178252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.178969 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.180694 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.181638 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.181978 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.182609 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.182991 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.183218 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.183745 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.183755 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.184279 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.199321 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.201979 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjh85"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.203252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4wvzk"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.208939 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.209665 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.213094 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.215185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.216219 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.217448 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.218236 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.225549 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.226092 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n2vkg"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.231709 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.234365 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.236090 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.242898 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.243818 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.248096 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2gkgd"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.248759 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.249230 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.249806 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.249811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.249899 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251445 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gwtwj"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdsr6\" (UniqueName: \"kubernetes.io/projected/6007cac9-0c7c-4d71-b65c-aa10735ecce4-kube-api-access-mdsr6\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251790 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251814 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5dk5\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-kube-api-access-d5dk5\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89cbcba8-bb80-4471-b57f-55cd045b68d9-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-encryption-config\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 
crc kubenswrapper[4743]: I1125 16:01:00.251923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdqx\" (UniqueName: \"kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-policies\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-auth-proxy-config\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.251989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmj97\" (UniqueName: 
\"kubernetes.io/projected/2b4b5943-89e2-483d-a034-1344fec03f98-kube-api-access-fmj97\") pod \"downloads-7954f5f757-55g75\" (UID: \"2b4b5943-89e2-483d-a034-1344fec03f98\") " pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgns\" (UniqueName: \"kubernetes.io/projected/95c66716-8eaa-4e63-a30d-5b871fffd090-kube-api-access-vjgns\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-serving-cert\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 
25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252079 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-serving-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvz42\" (UniqueName: \"kubernetes.io/projected/50460f60-2828-42ab-94aa-3ae9d13a5a1e-kube-api-access-bvz42\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95c66716-8eaa-4e63-a30d-5b871fffd090-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cbcba8-bb80-4471-b57f-55cd045b68d9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmr2\" (UniqueName: 
\"kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252190 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.252533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-images\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252809 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252955 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.252987 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766093a7-4c21-4cd4-b8b1-074140a620c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-client\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253056 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253104 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-config\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kjc5\" (UniqueName: \"kubernetes.io/projected/7d64e696-446f-4e5d-a276-4a9f18f291b2-kube-api-access-2kjc5\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt8bf\" (UniqueName: \"kubernetes.io/projected/89cbcba8-bb80-4471-b57f-55cd045b68d9-kube-api-access-zt8bf\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c219812-f1dd-44da-9a23-764167668a0f-serving-cert\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253250 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit-dir\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc5c20fa-6555-4502-8d1e-620e985c9607-serving-cert\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jxd8\" (UniqueName: \"kubernetes.io/projected/766093a7-4c21-4cd4-b8b1-074140a620c9-kube-api-access-2jxd8\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jlj\" (UniqueName: \"kubernetes.io/projected/fc5c20fa-6555-4502-8d1e-620e985c9607-kube-api-access-l2jlj\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-image-import-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-serving-cert\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253561 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-config\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766093a7-4c21-4cd4-b8b1-074140a620c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-encryption-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253780 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2jst\" (UniqueName: \"kubernetes.io/projected/8c219812-f1dd-44da-9a23-764167668a0f-kube-api-access-b2jst\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.253845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-client\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.253926 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpf42\" (UniqueName: \"kubernetes.io/projected/a655856c-3900-4342-a094-dc03b84c8876-kube-api-access-hpf42\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-service-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-dir\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a655856c-3900-4342-a094-dc03b84c8876-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254406 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254328 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c66716-8eaa-4e63-a30d-5b871fffd090-serving-cert\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/766093a7-4c21-4cd4-b8b1-074140a620c9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc 
kubenswrapper[4743]: I1125 16:01:00.254670 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254818 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.255020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.254944 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.255046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbbq\" (UniqueName: \"kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.255353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-node-pullsecrets\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.255462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7d64e696-446f-4e5d-a276-4a9f18f291b2-machine-approver-tls\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256440 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j9hhf"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.256584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl2rh\" (UniqueName: \"kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007cac9-0c7c-4d71-b65c-aa10735ecce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-trusted-ca\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsn7\" (UniqueName: 
\"kubernetes.io/projected/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-kube-api-access-xxsn7\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-config\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256796 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-config\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.256830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.260570 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.263787 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.265768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/766093a7-4c21-4cd4-b8b1-074140a620c9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.269335 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-55g75"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.270455 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.271410 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.272954 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w8pxz"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.273643 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.275072 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.275141 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4sghb"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.275516 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tjsl5"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.276453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.277492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.279527 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.280493 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.281579 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krv4b"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.282466 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.284062 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gkgd"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.285093 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bl29m"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.286124 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.286209 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.287083 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.288075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.289773 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.290289 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xbdmm"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.291309 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-84rbl"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.291403 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.292113 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-td7r9"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.293099 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.294232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4wvzk"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.294387 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.295247 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n2vkg"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.296660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.297951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.298436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.299433 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.300496 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bl29m"] Nov 25 
16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.301772 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.302614 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.303550 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5xxzs"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.304720 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.305079 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xxzs"] Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.324350 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.336345 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.356070 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359143 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/150f435e-dbbf-4106-bc06-046dd7abb405-signing-key\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-client\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kjc5\" (UniqueName: \"kubernetes.io/projected/7d64e696-446f-4e5d-a276-4a9f18f291b2-kube-api-access-2kjc5\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-config\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9p7\" (UniqueName: \"kubernetes.io/projected/125e1f77-3db7-4893-8127-fcd74903a65b-kube-api-access-rn9p7\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 
25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-metrics-certs\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359338 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt8bf\" (UniqueName: 
\"kubernetes.io/projected/89cbcba8-bb80-4471-b57f-55cd045b68d9-kube-api-access-zt8bf\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c219812-f1dd-44da-9a23-764167668a0f-serving-cert\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit-dir\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359425 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc5c20fa-6555-4502-8d1e-620e985c9607-serving-cert\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505a2c86-f87d-4179-9156-7c6b98ba9b84-service-ca-bundle\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9h2\" (UniqueName: \"kubernetes.io/projected/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-kube-api-access-vv9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jlj\" (UniqueName: \"kubernetes.io/projected/fc5c20fa-6555-4502-8d1e-620e985c9607-kube-api-access-l2jlj\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: 
\"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-image-import-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-serving-cert\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nghlb\" (UniqueName: \"kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359578 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-config\") 
pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359620 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/150f435e-dbbf-4106-bc06-046dd7abb405-signing-cabundle\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-encryption-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2jst\" (UniqueName: \"kubernetes.io/projected/8c219812-f1dd-44da-9a23-764167668a0f-kube-api-access-b2jst\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359875 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-client\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpf42\" (UniqueName: \"kubernetes.io/projected/a655856c-3900-4342-a094-dc03b84c8876-kube-api-access-hpf42\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-service-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.359982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-dir\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a655856c-3900-4342-a094-dc03b84c8876-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rgvm\" (UniqueName: \"kubernetes.io/projected/a231a0eb-9935-44d2-abf0-733aa2d944a6-kube-api-access-9rgvm\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360064 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360088 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360136 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c66716-8eaa-4e63-a30d-5b871fffd090-serving-cert\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360188 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkw6h\" (UniqueName: \"kubernetes.io/projected/954c73bd-5c98-4909-a01b-f28ca7be011e-kube-api-access-zkw6h\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56g7m\" (UniqueName: 
\"kubernetes.io/projected/505a2c86-f87d-4179-9156-7c6b98ba9b84-kube-api-access-56g7m\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fh4q\" (UniqueName: \"kubernetes.io/projected/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-kube-api-access-2fh4q\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-default-certificate\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: 
I1125 16:01:00.360359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a231a0eb-9935-44d2-abf0-733aa2d944a6-tmpfs\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbbq\" (UniqueName: \"kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-node-pullsecrets\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl2rh\" (UniqueName: \"kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360427 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7d64e696-446f-4e5d-a276-4a9f18f291b2-machine-approver-tls\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-stats-auth\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007cac9-0c7c-4d71-b65c-aa10735ecce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-trusted-ca\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dct6\" (UniqueName: \"kubernetes.io/projected/a691b039-6ec5-4ba1-b64a-badfaaff730e-kube-api-access-4dct6\") pod \"migrator-59844c95c7-6mhkh\" (UID: \"a691b039-6ec5-4ba1-b64a-badfaaff730e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsn7\" (UniqueName: \"kubernetes.io/projected/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-kube-api-access-xxsn7\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-config\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360691 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-config\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7cwf\" (UniqueName: \"kubernetes.io/projected/150f435e-dbbf-4106-bc06-046dd7abb405-kube-api-access-x7cwf\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5dk5\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-kube-api-access-d5dk5\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdsr6\" (UniqueName: \"kubernetes.io/projected/6007cac9-0c7c-4d71-b65c-aa10735ecce4-kube-api-access-mdsr6\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89cbcba8-bb80-4471-b57f-55cd045b68d9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-encryption-config\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.360903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-node-pullsecrets\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.360933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-webhook-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.361919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-config\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.362362 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.362763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-service-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.363540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.363942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-config\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.364274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.364391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7d64e696-446f-4e5d-a276-4a9f18f291b2-machine-approver-tls\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.364502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.364640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.364984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.365108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-config\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.365123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.365661 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-config\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.365787 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.366980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-auth-proxy-config\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367602 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdqx\" (UniqueName: \"kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc 
kubenswrapper[4743]: I1125 16:01:00.367646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367685 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-profile-collector-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-policies\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367776 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmj97\" (UniqueName: 
\"kubernetes.io/projected/2b4b5943-89e2-483d-a034-1344fec03f98-kube-api-access-fmj97\") pod \"downloads-7954f5f757-55g75\" (UID: \"2b4b5943-89e2-483d-a034-1344fec03f98\") " pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367803 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgns\" (UniqueName: \"kubernetes.io/projected/95c66716-8eaa-4e63-a30d-5b871fffd090-kube-api-access-vjgns\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.367994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc5c20fa-6555-4502-8d1e-620e985c9607-serving-cert\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c219812-f1dd-44da-9a23-764167668a0f-trusted-ca\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 
16:01:00.368370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit-dir\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368475 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d64e696-446f-4e5d-a276-4a9f18f291b2-auth-proxy-config\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6007cac9-0c7c-4d71-b65c-aa10735ecce4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-serving-cert\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-serving-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvz42\" (UniqueName: \"kubernetes.io/projected/50460f60-2828-42ab-94aa-3ae9d13a5a1e-kube-api-access-bvz42\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95c66716-8eaa-4e63-a30d-5b871fffd090-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cbcba8-bb80-4471-b57f-55cd045b68d9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.368992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/954c73bd-5c98-4909-a01b-f28ca7be011e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmr2\" (UniqueName: \"kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-dir\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-etcd-client\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.369936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-image-import-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370359 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9rxr\" (UniqueName: \"kubernetes.io/projected/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-kube-api-access-w9rxr\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: 
\"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/95c66716-8eaa-4e63-a30d-5b871fffd090-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370737 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.370807 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-images\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle\") pod 
\"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371582 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cbcba8-bb80-4471-b57f-55cd045b68d9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a655856c-3900-4342-a094-dc03b84c8876-images\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.371972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.372302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-client\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.372636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5c20fa-6555-4502-8d1e-620e985c9607-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.372746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-audit-policies\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.372659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.372949 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c66716-8eaa-4e63-a30d-5b871fffd090-serving-cert\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.373928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a655856c-3900-4342-a094-dc03b84c8876-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-encryption-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-etcd-serving-ca\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-config\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50460f60-2828-42ab-94aa-3ae9d13a5a1e-audit\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.374926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.375027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.375452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.376566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-serving-cert\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.377022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.378308 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.378527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50460f60-2828-42ab-94aa-3ae9d13a5a1e-serving-cert\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.378600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.378648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c219812-f1dd-44da-9a23-764167668a0f-serving-cert\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.378887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-encryption-config\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.380549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89cbcba8-bb80-4471-b57f-55cd045b68d9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.394757 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.414804 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.436320 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.456574 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-webhook-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473275 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-profile-collector-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/954c73bd-5c98-4909-a01b-f28ca7be011e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9rxr\" (UniqueName: \"kubernetes.io/projected/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-kube-api-access-w9rxr\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/150f435e-dbbf-4106-bc06-046dd7abb405-signing-key\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9p7\" (UniqueName: \"kubernetes.io/projected/125e1f77-3db7-4893-8127-fcd74903a65b-kube-api-access-rn9p7\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473582 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-metrics-certs\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505a2c86-f87d-4179-9156-7c6b98ba9b84-service-ca-bundle\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.473788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9h2\" (UniqueName: \"kubernetes.io/projected/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-kube-api-access-vv9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nghlb\" (UniqueName: \"kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/150f435e-dbbf-4106-bc06-046dd7abb405-signing-cabundle\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rgvm\" (UniqueName: \"kubernetes.io/projected/a231a0eb-9935-44d2-abf0-733aa2d944a6-kube-api-access-9rgvm\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkw6h\" (UniqueName: \"kubernetes.io/projected/954c73bd-5c98-4909-a01b-f28ca7be011e-kube-api-access-zkw6h\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56g7m\" (UniqueName: \"kubernetes.io/projected/505a2c86-f87d-4179-9156-7c6b98ba9b84-kube-api-access-56g7m\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fh4q\" (UniqueName: \"kubernetes.io/projected/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-kube-api-access-2fh4q\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-default-certificate\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a231a0eb-9935-44d2-abf0-733aa2d944a6-tmpfs\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-stats-auth\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.474835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dct6\" (UniqueName: \"kubernetes.io/projected/a691b039-6ec5-4ba1-b64a-badfaaff730e-kube-api-access-4dct6\") pod \"migrator-59844c95c7-6mhkh\" (UID: \"a691b039-6ec5-4ba1-b64a-badfaaff730e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7cwf\" (UniqueName: \"kubernetes.io/projected/150f435e-dbbf-4106-bc06-046dd7abb405-kube-api-access-x7cwf\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a231a0eb-9935-44d2-abf0-733aa2d944a6-tmpfs\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.475669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.477527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.477883 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.494730 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.515063 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.535741 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.556972 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.574497 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.596137 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.615950 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.635367 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.655737 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.675484 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.694257 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.735927 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.756698 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.775863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.795717 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.809115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-default-certificate\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.815729 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.829411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-stats-auth\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.836173 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.847965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/505a2c86-f87d-4179-9156-7c6b98ba9b84-metrics-certs\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.855793 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.865041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/505a2c86-f87d-4179-9156-7c6b98ba9b84-service-ca-bundle\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.874380 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.895793 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.916985 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.936909 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.956032 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.976763 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Nov 25 16:01:00 crc kubenswrapper[4743]: I1125 16:01:00.996146 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.009684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/954c73bd-5c98-4909-a01b-f28ca7be011e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.015742 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.036583 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.056511 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.076014 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.096830 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.115067 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.135156 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.155841 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.177544 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.192951 4743 request.go:700] Waited for 1.009032377s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.195942 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.208424 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-profile-collector-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.216805 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.235470 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.248137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-webhook-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.249586 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a231a0eb-9935-44d2-abf0-733aa2d944a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.257090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.275884 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.296313 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.316028 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.328670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/150f435e-dbbf-4106-bc06-046dd7abb405-signing-key\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.336687 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.345497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/150f435e-dbbf-4106-bc06-046dd7abb405-signing-cabundle\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.355329 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.376207 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.396542 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.416317 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.436132 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.456143 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.473734 4743 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.473856 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca podName:baeafcba-9592-4136-9893-4bf3b9295041 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:01.973830304 +0000 UTC m=+141.095669853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca") pod "marketplace-operator-79b997595-pbz24" (UID: "baeafcba-9592-4136-9893-4bf3b9295041") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.474073 4743 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.474097 4743 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.474166 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics podName:baeafcba-9592-4136-9893-4bf3b9295041 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:01.974150353 +0000 UTC m=+141.095989902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics") pod "marketplace-operator-79b997595-pbz24" (UID: "baeafcba-9592-4136-9893-4bf3b9295041") : failed to sync secret cache: timed out waiting for the condition
Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.474191 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert podName:125e1f77-3db7-4893-8127-fcd74903a65b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:01.974181374 +0000 UTC m=+141.096020923 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert") pod "catalog-operator-68c6474976-f5d99" (UID: "125e1f77-3db7-4893-8127-fcd74903a65b") : failed to sync secret cache: timed out waiting for the condition Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.476101 4743 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 16:01:01 crc kubenswrapper[4743]: E1125 16:01:01.476229 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert podName:e6f5cab6-00e5-44e0-8213-0754e21b2cb4 nodeName:}" failed. No retries permitted until 2025-11-25 16:01:01.976197741 +0000 UTC m=+141.098037450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert") pod "ingress-canary-2gkgd" (UID: "e6f5cab6-00e5-44e0-8213-0754e21b2cb4") : failed to sync secret cache: timed out waiting for the condition Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.476236 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.495882 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.516172 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.535972 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.556494 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.576008 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.596170 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.615628 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.635968 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.655518 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.676441 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.696244 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.715655 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.736554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.755212 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: 
I1125 16:01:01.776341 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.804666 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.816118 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.834441 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.855535 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.875861 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.895389 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.932076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jxd8\" (UniqueName: \"kubernetes.io/projected/766093a7-4c21-4cd4-b8b1-074140a620c9-kube-api-access-2jxd8\") pod \"openshift-apiserver-operator-796bbdcf4f-rdvpf\" (UID: \"766093a7-4c21-4cd4-b8b1-074140a620c9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.956693 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.964735 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.975706 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.996236 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.999393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.999437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.999586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:01 crc kubenswrapper[4743]: I1125 16:01:01.999690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.001346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.004013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/125e1f77-3db7-4893-8127-fcd74903a65b-srv-cert\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.005258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.005287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-cert\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.015771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 16:01:02 crc 
kubenswrapper[4743]: I1125 16:01:02.035154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.055899 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.075552 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.096457 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.116228 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.154703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kjc5\" (UniqueName: \"kubernetes.io/projected/7d64e696-446f-4e5d-a276-4a9f18f291b2-kube-api-access-2kjc5\") pod \"machine-approver-56656f9798-l4t62\" (UID: \"7d64e696-446f-4e5d-a276-4a9f18f291b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.169270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.190914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbbq\" (UniqueName: 
\"kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq\") pod \"console-f9d7485db-4sghb\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.193509 4743 request.go:700] Waited for 1.829949128s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.212399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl2rh\" (UniqueName: \"kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh\") pod \"route-controller-manager-6576b87f9c-w8xnw\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.232560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt8bf\" (UniqueName: \"kubernetes.io/projected/89cbcba8-bb80-4471-b57f-55cd045b68d9-kube-api-access-zt8bf\") pod \"openshift-controller-manager-operator-756b6f6bc6-phb4q\" (UID: \"89cbcba8-bb80-4471-b57f-55cd045b68d9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.249452 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5dk5\" (UniqueName: \"kubernetes.io/projected/090b15f7-96ff-4c51-ac41-59d1ed0c66a7-kube-api-access-d5dk5\") pod \"cluster-image-registry-operator-dc59b4c8b-dmmkv\" (UID: \"090b15f7-96ff-4c51-ac41-59d1ed0c66a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.279136 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mdsr6\" (UniqueName: \"kubernetes.io/projected/6007cac9-0c7c-4d71-b65c-aa10735ecce4-kube-api-access-mdsr6\") pod \"cluster-samples-operator-665b6dd947-qph9q\" (UID: \"6007cac9-0c7c-4d71-b65c-aa10735ecce4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.290012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jlj\" (UniqueName: \"kubernetes.io/projected/fc5c20fa-6555-4502-8d1e-620e985c9607-kube-api-access-l2jlj\") pod \"authentication-operator-69f744f599-gwtwj\" (UID: \"fc5c20fa-6555-4502-8d1e-620e985c9607\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.312302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsn7\" (UniqueName: \"kubernetes.io/projected/1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb-kube-api-access-xxsn7\") pod \"apiserver-7bbb656c7d-t6sfd\" (UID: \"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.324577 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.332840 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmj97\" (UniqueName: \"kubernetes.io/projected/2b4b5943-89e2-483d-a034-1344fec03f98-kube-api-access-fmj97\") pod \"downloads-7954f5f757-55g75\" (UID: \"2b4b5943-89e2-483d-a034-1344fec03f98\") " pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.335209 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.346500 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.351434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgns\" (UniqueName: \"kubernetes.io/projected/95c66716-8eaa-4e63-a30d-5b871fffd090-kube-api-access-vjgns\") pod \"openshift-config-operator-7777fb866f-hmdm5\" (UID: \"95c66716-8eaa-4e63-a30d-5b871fffd090\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.360184 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.376967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpf42\" (UniqueName: \"kubernetes.io/projected/a655856c-3900-4342-a094-dc03b84c8876-kube-api-access-hpf42\") pod \"machine-api-operator-5694c8668f-j9hhf\" (UID: \"a655856c-3900-4342-a094-dc03b84c8876\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.386476 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.387035 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.388054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf"] Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.390144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2jst\" (UniqueName: \"kubernetes.io/projected/8c219812-f1dd-44da-9a23-764167668a0f-kube-api-access-b2jst\") pod \"console-operator-58897d9998-zjh85\" (UID: \"8c219812-f1dd-44da-9a23-764167668a0f\") " pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.393973 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:02 crc kubenswrapper[4743]: W1125 16:01:02.398457 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766093a7_4c21_4cd4_b8b1_074140a620c9.slice/crio-ad78ede10b4578abd0dbd0b547ce1a7d971a466e870fd554f103dbacdab2574e WatchSource:0}: Error finding container ad78ede10b4578abd0dbd0b547ce1a7d971a466e870fd554f103dbacdab2574e: Status 404 returned error can't find the container with id ad78ede10b4578abd0dbd0b547ce1a7d971a466e870fd554f103dbacdab2574e Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.411697 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.415835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdqx\" (UniqueName: \"kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx\") pod \"controller-manager-879f6c89f-5n9pg\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.434222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvz42\" (UniqueName: \"kubernetes.io/projected/50460f60-2828-42ab-94aa-3ae9d13a5a1e-kube-api-access-bvz42\") pod \"apiserver-76f77b778f-tjsl5\" (UID: \"50460f60-2828-42ab-94aa-3ae9d13a5a1e\") " pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.471537 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.475412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmr2\" (UniqueName: \"kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2\") pod \"oauth-openshift-558db77b4-w8pxz\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.477681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9rxr\" (UniqueName: \"kubernetes.io/projected/c3569861-ce7b-4e88-a5ec-77a5ea7e995e-kube-api-access-w9rxr\") pod \"multus-admission-controller-857f4d67dd-krv4b\" (UID: \"c3569861-ce7b-4e88-a5ec-77a5ea7e995e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.491422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" event={"ID":"7d64e696-446f-4e5d-a276-4a9f18f291b2","Type":"ContainerStarted","Data":"c83be7dc75e60fc90ab4a2e5e6144ddf649db6b887c208f501854d58613ffcae"} Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.493795 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.494538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" event={"ID":"766093a7-4c21-4cd4-b8b1-074140a620c9","Type":"ContainerStarted","Data":"ad78ede10b4578abd0dbd0b547ce1a7d971a466e870fd554f103dbacdab2574e"} Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.496371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9p7\" (UniqueName: \"kubernetes.io/projected/125e1f77-3db7-4893-8127-fcd74903a65b-kube-api-access-rn9p7\") pod \"catalog-operator-68c6474976-f5d99\" (UID: \"125e1f77-3db7-4893-8127-fcd74903a65b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.511206 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.513239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9h2\" (UniqueName: \"kubernetes.io/projected/3b2a5449-98a1-46dd-893d-7a7b8e5bf0de-kube-api-access-vv9h2\") pod \"kube-storage-version-migrator-operator-b67b599dd-76kq2\" (UID: \"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.545227 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nghlb\" (UniqueName: \"kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb\") pod \"marketplace-operator-79b997595-pbz24\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.557084 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"] Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.564217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56g7m\" (UniqueName: \"kubernetes.io/projected/505a2c86-f87d-4179-9156-7c6b98ba9b84-kube-api-access-56g7m\") pod \"router-default-5444994796-ldzq4\" (UID: \"505a2c86-f87d-4179-9156-7c6b98ba9b84\") " pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.570606 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.574265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fh4q\" (UniqueName: \"kubernetes.io/projected/e6f5cab6-00e5-44e0-8213-0754e21b2cb4-kube-api-access-2fh4q\") pod \"ingress-canary-2gkgd\" (UID: \"e6f5cab6-00e5-44e0-8213-0754e21b2cb4\") " pod="openshift-ingress-canary/ingress-canary-2gkgd" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.595167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rgvm\" (UniqueName: \"kubernetes.io/projected/a231a0eb-9935-44d2-abf0-733aa2d944a6-kube-api-access-9rgvm\") pod \"packageserver-d55dfcdfc-4v47p\" (UID: \"a231a0eb-9935-44d2-abf0-733aa2d944a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.596889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.614493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkw6h\" (UniqueName: \"kubernetes.io/projected/954c73bd-5c98-4909-a01b-f28ca7be011e-kube-api-access-zkw6h\") pod \"package-server-manager-789f6589d5-r5h55\" (UID: \"954c73bd-5c98-4909-a01b-f28ca7be011e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.615126 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.632187 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dct6\" (UniqueName: \"kubernetes.io/projected/a691b039-6ec5-4ba1-b64a-badfaaff730e-kube-api-access-4dct6\") pod \"migrator-59844c95c7-6mhkh\" (UID: \"a691b039-6ec5-4ba1-b64a-badfaaff730e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.654092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7cwf\" (UniqueName: \"kubernetes.io/projected/150f435e-dbbf-4106-bc06-046dd7abb405-kube-api-access-x7cwf\") pod \"service-ca-9c57cc56f-4wvzk\" (UID: \"150f435e-dbbf-4106-bc06-046dd7abb405\") " pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.672353 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.677957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-55g75"]
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.680965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.688146 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gkgd"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4fr\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719282 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-srv-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74fr\" (UniqueName: \"kubernetes.io/projected/8421cc57-3b6e-44b6-a732-75cce2177e80-kube-api-access-w74fr\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719698 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-serving-cert\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-proxy-tls\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-service-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48fh5\" (UniqueName: \"kubernetes.io/projected/a6cf416d-49d9-4e74-846b-7d6923ae415f-kube-api-access-48fh5\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719883 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-proxy-tls\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2jlb\" (UniqueName: \"kubernetes.io/projected/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-kube-api-access-c2jlb\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.719989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhn7q\" (UniqueName: \"kubernetes.io/projected/fb60121d-df03-4f88-a9e5-118105c6ce94-kube-api-access-vhn7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2185f4f-1936-4335-9aab-67538f5e3888-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-config\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a94872-5a77-4279-a045-55a6abe7a781-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720219 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwb2p\" (UniqueName: \"kubernetes.io/projected/f83d7b2d-d239-4476-b094-929bce4f8b20-kube-api-access-wwb2p\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.720242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-client\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: E1125 16:01:02.720821 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.220805581 +0000 UTC m=+142.342645220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2185f4f-1936-4335-9aab-67538f5e3888-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b022022e-00fd-43df-99b2-eb91f39e4264-metrics-tls\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-images\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdfgh\" (UniqueName: \"kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7b2d-d239-4476-b094-929bce4f8b20-serving-cert\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8421cc57-3b6e-44b6-a732-75cce2177e80-metrics-tls\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.721732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b022022e-00fd-43df-99b2-eb91f39e4264-trusted-ca\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.723129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.723170 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpbg\" (UniqueName: \"kubernetes.io/projected/c99ef725-44cc-4a3f-9ee1-2df041f7a254-kube-api-access-6hpbg\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.723262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx72r\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-kube-api-access-tx72r\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725454 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-config\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7b2d-d239-4476-b094-929bce4f8b20-config\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725807 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a94872-5a77-4279-a045-55a6abe7a781-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvz7\" (UniqueName: \"kubernetes.io/projected/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-kube-api-access-9hvz7\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2185f4f-1936-4335-9aab-67538f5e3888-config\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb60121d-df03-4f88-a9e5-118105c6ce94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.725945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.726016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.726036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a94872-5a77-4279-a045-55a6abe7a781-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.726087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.739253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.748577 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.818183 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q"]
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-socket-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b022022e-00fd-43df-99b2-eb91f39e4264-trusted-ca\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwdg\" (UniqueName: \"kubernetes.io/projected/e411e757-a7ba-45fe-b248-8eca5d468049-kube-api-access-snwdg\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xzwv\" (UniqueName: \"kubernetes.io/projected/63c6e7d6-50c6-4a7e-b318-edbfa0496006-kube-api-access-2xzwv\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpbg\" (UniqueName: \"kubernetes.io/projected/c99ef725-44cc-4a3f-9ee1-2df041f7a254-kube-api-access-6hpbg\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827772 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx72r\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-kube-api-access-tx72r\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-config\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7b2d-d239-4476-b094-929bce4f8b20-config\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a94872-5a77-4279-a045-55a6abe7a781-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvz7\" (UniqueName: \"kubernetes.io/projected/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-kube-api-access-9hvz7\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827912 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2185f4f-1936-4335-9aab-67538f5e3888-config\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb60121d-df03-4f88-a9e5-118105c6ce94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.827958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a94872-5a77-4279-a045-55a6abe7a781-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-mountpoint-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4fr\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828136 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-srv-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74fr\" (UniqueName: \"kubernetes.io/projected/8421cc57-3b6e-44b6-a732-75cce2177e80-kube-api-access-w74fr\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/671bb022-0c33-494e-a398-9e980eb0d3fd-metrics-tls\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828320 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-csi-data-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m"
Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828342 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-plugins-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m"
Nov 25 16:01:02 crc kubenswrapper[4743]:
I1125 16:01:02.828364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-serving-cert\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-proxy-tls\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828503 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-registration-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-proxy-tls\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2jlb\" (UniqueName: \"kubernetes.io/projected/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-kube-api-access-c2jlb\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-service-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48fh5\" (UniqueName: 
\"kubernetes.io/projected/a6cf416d-49d9-4e74-846b-7d6923ae415f-kube-api-access-48fh5\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhn7q\" (UniqueName: \"kubernetes.io/projected/fb60121d-df03-4f88-a9e5-118105c6ce94-kube-api-access-vhn7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2185f4f-1936-4335-9aab-67538f5e3888-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/671bb022-0c33-494e-a398-9e980eb0d3fd-config-volume\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-node-bootstrap-token\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 
16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-certs\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-config\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828807 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a94872-5a77-4279-a045-55a6abe7a781-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwb2p\" (UniqueName: \"kubernetes.io/projected/f83d7b2d-d239-4476-b094-929bce4f8b20-kube-api-access-wwb2p\") pod \"service-ca-operator-777779d784-7j4pk\" 
(UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828855 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-client\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828881 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzc5\" (UniqueName: \"kubernetes.io/projected/671bb022-0c33-494e-a398-9e980eb0d3fd-kube-api-access-lxzc5\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2185f4f-1936-4335-9aab-67538f5e3888-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b022022e-00fd-43df-99b2-eb91f39e4264-metrics-tls\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828950 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-images\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdfgh\" (UniqueName: \"kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.828991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7b2d-d239-4476-b094-929bce4f8b20-serving-cert\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.829007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8421cc57-3b6e-44b6-a732-75cce2177e80-metrics-tls\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.829408 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:02 crc kubenswrapper[4743]: E1125 16:01:02.830246 
4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.330224467 +0000 UTC m=+142.452064006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.833388 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b022022e-00fd-43df-99b2-eb91f39e4264-trusted-ca\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.834161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83d7b2d-d239-4476-b094-929bce4f8b20-config\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.834633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc 
kubenswrapper[4743]: I1125 16:01:02.834912 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.835495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-srv-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.836255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-config\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.836825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.837342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.838500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2185f4f-1936-4335-9aab-67538f5e3888-config\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.842963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.844728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83d7b2d-d239-4476-b094-929bce4f8b20-serving-cert\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.845058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-service-ca\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.845230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2185f4f-1936-4335-9aab-67538f5e3888-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 
16:01:02.846253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.846529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.849528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.851260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90a94872-5a77-4279-a045-55a6abe7a781-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.852275 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb60121d-df03-4f88-a9e5-118105c6ce94-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.854198 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-config\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.856392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90a94872-5a77-4279-a045-55a6abe7a781-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.857012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.860728 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.862464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-proxy-tls\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.866142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-serving-cert\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.872260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6cf416d-49d9-4e74-846b-7d6923ae415f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.873144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8421cc57-3b6e-44b6-a732-75cce2177e80-metrics-tls\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.874113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.877274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.877540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c99ef725-44cc-4a3f-9ee1-2df041f7a254-etcd-client\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.878820 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.881112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b022022e-00fd-43df-99b2-eb91f39e4264-metrics-tls\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.883397 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv"] Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.885380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.886760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-proxy-tls\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.892429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.899839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2185f4f-1936-4335-9aab-67538f5e3888-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-58mwb\" (UID: \"d2185f4f-1936-4335-9aab-67538f5e3888\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.909482 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.917114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpbg\" (UniqueName: \"kubernetes.io/projected/c99ef725-44cc-4a3f-9ee1-2df041f7a254-kube-api-access-6hpbg\") pod \"etcd-operator-b45778765-n2vkg\" (UID: \"c99ef725-44cc-4a3f-9ee1-2df041f7a254\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-registration-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/671bb022-0c33-494e-a398-9e980eb0d3fd-config-volume\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-node-bootstrap-token\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-certs\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " 
pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzc5\" (UniqueName: \"kubernetes.io/projected/671bb022-0c33-494e-a398-9e980eb0d3fd-kube-api-access-lxzc5\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-socket-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwdg\" (UniqueName: \"kubernetes.io/projected/e411e757-a7ba-45fe-b248-8eca5d468049-kube-api-access-snwdg\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xzwv\" (UniqueName: \"kubernetes.io/projected/63c6e7d6-50c6-4a7e-b318-edbfa0496006-kube-api-access-2xzwv\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930366 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-mountpoint-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: 
\"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/671bb022-0c33-494e-a398-9e980eb0d3fd-metrics-tls\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-csi-data-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-plugins-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.930484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:02 crc kubenswrapper[4743]: E1125 16:01:02.930909 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:03.430889996 +0000 UTC m=+142.552729545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.931252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-socket-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.931482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-csi-data-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.931499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-plugins-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.931569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-registration-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: 
\"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.931796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/63c6e7d6-50c6-4a7e-b318-edbfa0496006-mountpoint-dir\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.937292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx72r\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-kube-api-access-tx72r\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.939054 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/671bb022-0c33-494e-a398-9e980eb0d3fd-metrics-tls\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.960549 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.972531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48fh5\" (UniqueName: \"kubernetes.io/projected/a6cf416d-49d9-4e74-846b-7d6923ae415f-kube-api-access-48fh5\") pod \"olm-operator-6b444d44fb-cbr2m\" (UID: \"a6cf416d-49d9-4e74-846b-7d6923ae415f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.976496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhn7q\" (UniqueName: \"kubernetes.io/projected/fb60121d-df03-4f88-a9e5-118105c6ce94-kube-api-access-vhn7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-8vr4f\" (UID: \"fb60121d-df03-4f88-a9e5-118105c6ce94\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" Nov 25 16:01:02 crc kubenswrapper[4743]: I1125 16:01:02.992684 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.002772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/671bb022-0c33-494e-a398-9e980eb0d3fd-config-volume\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.012623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-certs\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.013158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e411e757-a7ba-45fe-b248-8eca5d468049-node-bootstrap-token\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.016407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-swbj6\" (UID: \"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.017632 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.017947 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-images\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.023533 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4sghb"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.028787 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.029219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90a94872-5a77-4279-a045-55a6abe7a781-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nqrqh\" (UID: \"90a94872-5a77-4279-a045-55a6abe7a781\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.032047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.032538 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.532511851 +0000 UTC m=+142.654351420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.041412 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvz7\" (UniqueName: \"kubernetes.io/projected/e7b1cd86-dbe1-4674-9159-2e9dbcc6182c-kube-api-access-9hvz7\") pod \"machine-config-controller-84d6567774-2h6h9\" (UID: \"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.050496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdfgh\" (UniqueName: \"kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh\") pod \"collect-profiles-29401440-92l65\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.055502 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.071627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74fr\" (UniqueName: \"kubernetes.io/projected/8421cc57-3b6e-44b6-a732-75cce2177e80-kube-api-access-w74fr\") pod \"dns-operator-744455d44c-84rbl\" (UID: \"8421cc57-3b6e-44b6-a732-75cce2177e80\") " pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.094479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2jlb\" (UniqueName: \"kubernetes.io/projected/cffdcaea-9ca5-44de-bb0b-83d3c8d1da60-kube-api-access-c2jlb\") pod \"machine-config-operator-74547568cd-lqk6q\" (UID: \"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.110752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4fr\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.113066 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.121171 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.133636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b022022e-00fd-43df-99b2-eb91f39e4264-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4dtjj\" (UID: \"b022022e-00fd-43df-99b2-eb91f39e4264\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.140264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.140613 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.640580439 +0000 UTC m=+142.762419988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.140841 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.151499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwb2p\" (UniqueName: \"kubernetes.io/projected/f83d7b2d-d239-4476-b094-929bce4f8b20-kube-api-access-wwb2p\") pod \"service-ca-operator-777779d784-7j4pk\" (UID: \"f83d7b2d-d239-4476-b094-929bce4f8b20\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.170121 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.179493 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.194369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gwtwj"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.199458 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w8pxz"] Nov 25 16:01:03 crc kubenswrapper[4743]: W1125 16:01:03.200835 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89cbcba8_bb80_4471_b57f_55cd045b68d9.slice/crio-1c58d88aa3b21ff1a9f786abcf9e8adb49bc6735194e812409e9217d0a0a0fbb WatchSource:0}: Error finding container 1c58d88aa3b21ff1a9f786abcf9e8adb49bc6735194e812409e9217d0a0a0fbb: Status 404 returned error can't find the container with id 1c58d88aa3b21ff1a9f786abcf9e8adb49bc6735194e812409e9217d0a0a0fbb Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.206059 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:03 crc kubenswrapper[4743]: W1125 16:01:03.208407 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090b15f7_96ff_4c51_ac41_59d1ed0c66a7.slice/crio-374cab5a6bb5b827badd033f2653a88c66c9062292b41aed5245c02770fc1173 WatchSource:0}: Error finding container 374cab5a6bb5b827badd033f2653a88c66c9062292b41aed5245c02770fc1173: Status 404 returned error can't find the container with id 374cab5a6bb5b827badd033f2653a88c66c9062292b41aed5245c02770fc1173 Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.216539 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwdg\" (UniqueName: \"kubernetes.io/projected/e411e757-a7ba-45fe-b248-8eca5d468049-kube-api-access-snwdg\") pod \"machine-config-server-xbdmm\" (UID: \"e411e757-a7ba-45fe-b248-8eca5d468049\") " pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.221931 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.231053 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.238949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xzwv\" (UniqueName: \"kubernetes.io/projected/63c6e7d6-50c6-4a7e-b318-edbfa0496006-kube-api-access-2xzwv\") pod \"csi-hostpathplugin-bl29m\" (UID: \"63c6e7d6-50c6-4a7e-b318-edbfa0496006\") " pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.243541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.246368 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.246507 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.746490456 +0000 UTC m=+142.868330005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.246576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.247243 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.747223787 +0000 UTC m=+142.869063336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.289410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzc5\" (UniqueName: \"kubernetes.io/projected/671bb022-0c33-494e-a398-9e980eb0d3fd-kube-api-access-lxzc5\") pod \"dns-default-5xxzs\" (UID: \"671bb022-0c33-494e-a398-9e980eb0d3fd\") " pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.322412 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.328745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xbdmm" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.332282 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.337356 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.348164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.348699 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.848673608 +0000 UTC m=+142.970513157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.349232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.349861 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-11-25 16:01:03.849850751 +0000 UTC m=+142.971690300 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.462188 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.462852 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:03.962815557 +0000 UTC m=+143.084655106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.508708 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-55g75" event={"ID":"2b4b5943-89e2-483d-a034-1344fec03f98","Type":"ContainerStarted","Data":"be742e39b251a11effbc3b6bccaceca2e0e29b4349dd17655980dad526949a0f"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.509870 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zjh85"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.509941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" event={"ID":"fc5c20fa-6555-4502-8d1e-620e985c9607","Type":"ContainerStarted","Data":"0ca53131ac3d3caed05832d762d9adffe95fe797cb8ae4973ecfb28cbfb430df"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.511201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" event={"ID":"c5593b1c-91e9-48c0-b348-cd0a46f64639","Type":"ContainerStarted","Data":"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.511241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" 
event={"ID":"c5593b1c-91e9-48c0-b348-cd0a46f64639","Type":"ContainerStarted","Data":"3998ff0d5ccaadb4e3573a68f7642b085bffccabe932e1f016b791d01069cfb1"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.512740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sghb" event={"ID":"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0","Type":"ContainerStarted","Data":"25649e0aea91d895fa9ebc882dccb6c14b84714fc7d19306819b3cef8660de1c"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.513982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" event={"ID":"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb","Type":"ContainerStarted","Data":"61f5f3ce9f39ef7693e4d53536e234b6651396b8ad540515d2edf721a204b6c1"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.520065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ldzq4" event={"ID":"505a2c86-f87d-4179-9156-7c6b98ba9b84","Type":"ContainerStarted","Data":"b8e0da675de209008d715da06b09538de069ead19f98df691e0d1858fed099b8"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.528996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" event={"ID":"766093a7-4c21-4cd4-b8b1-074140a620c9","Type":"ContainerStarted","Data":"0f0ebf82278579894d56e8c3bc14592a2dc93cd8ce75cbabd9f723931506ee9e"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.530655 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j9hhf"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.530721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" event={"ID":"b7e20dd3-f239-419d-bc24-5e38d66e7803","Type":"ContainerStarted","Data":"425ae9c853d5ab80fd7c0249df4c19af14b2530b615888fa56915c5ec9ce9968"} Nov 
25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.532737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" event={"ID":"89cbcba8-bb80-4471-b57f-55cd045b68d9","Type":"ContainerStarted","Data":"1c58d88aa3b21ff1a9f786abcf9e8adb49bc6735194e812409e9217d0a0a0fbb"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.534745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" event={"ID":"090b15f7-96ff-4c51-ac41-59d1ed0c66a7","Type":"ContainerStarted","Data":"374cab5a6bb5b827badd033f2653a88c66c9062292b41aed5245c02770fc1173"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.543020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" event={"ID":"7d64e696-446f-4e5d-a276-4a9f18f291b2","Type":"ContainerStarted","Data":"4ade0166c5f8c32b9a3cc84bc421f4776c9be9bfa72e5d0f32ff1f928147f531"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.549912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" event={"ID":"2e800807-1cef-4dcb-9001-48322127beb9","Type":"ContainerStarted","Data":"fd6e47d2ae17f8d7662fae77bb4c1321c84f256995e1c30e526d533eed3f09b5"} Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.564114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.564568 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.064553185 +0000 UTC m=+143.186392734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.640902 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.640957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.640969 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.648740 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-krv4b"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.652287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5"] Nov 25 16:01:03 crc kubenswrapper[4743]: W1125 16:01:03.653962 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda655856c_3900_4342_a094_dc03b84c8876.slice/crio-8f0bffd353d4a337791d31c671af9c9be6a6d8d1a3a01f2469b9a6461248b0a0 
WatchSource:0}: Error finding container 8f0bffd353d4a337791d31c671af9c9be6a6d8d1a3a01f2469b9a6461248b0a0: Status 404 returned error can't find the container with id 8f0bffd353d4a337791d31c671af9c9be6a6d8d1a3a01f2469b9a6461248b0a0 Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.653998 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55"] Nov 25 16:01:03 crc kubenswrapper[4743]: W1125 16:01:03.660730 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c219812_f1dd_44da_9a23_764167668a0f.slice/crio-f8bcabd5954baf30d8df813e07c1bb93ce502ca8d5b521bd6bbf568841885bf0 WatchSource:0}: Error finding container f8bcabd5954baf30d8df813e07c1bb93ce502ca8d5b521bd6bbf568841885bf0: Status 404 returned error can't find the container with id f8bcabd5954baf30d8df813e07c1bb93ce502ca8d5b521bd6bbf568841885bf0 Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.665755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.665973 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.165939745 +0000 UTC m=+143.287779294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.666097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.666991 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.166966574 +0000 UTC m=+143.288806123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: W1125 16:01:03.683687 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaeafcba_9592_4136_9893_4bf3b9295041.slice/crio-2e803c23206fbb200c00facd5c5fad70f4f1204969966ca692b936825510303a WatchSource:0}: Error finding container 2e803c23206fbb200c00facd5c5fad70f4f1204969966ca692b936825510303a: Status 404 returned error can't find the container with id 2e803c23206fbb200c00facd5c5fad70f4f1204969966ca692b936825510303a Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.742448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n2vkg"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.770060 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.770408 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.270392721 +0000 UTC m=+143.392232270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863484 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863525 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863538 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863551 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tjsl5"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.863575 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.870692 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4wvzk"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.871513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.872036 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.372013847 +0000 UTC m=+143.493853396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.883647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gkgd"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.885404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q"] Nov 25 16:01:03 crc kubenswrapper[4743]: I1125 16:01:03.973011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:03 crc kubenswrapper[4743]: E1125 16:01:03.974521 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.474489717 +0000 UTC m=+143.596329266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.006850 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.077574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.077896 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.577882613 +0000 UTC m=+143.699722162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.123555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rdvpf" podStartSLOduration=120.12353909 podStartE2EDuration="2m0.12353909s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:04.122646825 +0000 UTC m=+143.244486374" watchObservedRunningTime="2025-11-25 16:01:04.12353909 +0000 UTC m=+143.245378639" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.178385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.179203 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.679142868 +0000 UTC m=+143.800982427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.179451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.179841 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.679834008 +0000 UTC m=+143.801673547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.280428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.280654 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.780633161 +0000 UTC m=+143.902472700 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.280768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.281151 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.781140564 +0000 UTC m=+143.902980113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.381661 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.382563 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.882543334 +0000 UTC m=+144.004382883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.483468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.483876 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:04.983861112 +0000 UTC m=+144.105700661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.553740 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.567754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" event={"ID":"090b15f7-96ff-4c51-ac41-59d1ed0c66a7","Type":"ContainerStarted","Data":"5a9fe37855d0ab0072391aee23b329e465b8496ddae620463dacd23e39e1501f"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.573482 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.574687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" event={"ID":"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60","Type":"ContainerStarted","Data":"095fa80362c918cf31c7b2a42cdde17dccf98f3b74843cd2abba83f1f2c502de"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.578983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" event={"ID":"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687","Type":"ContainerStarted","Data":"dfb87c51d8df7a4cd921f402098a870e3d69f6605058df56423b6da1095780c4"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.581146 4743 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.585931 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.587352 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.087292199 +0000 UTC m=+144.209131748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.590650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" event={"ID":"90a94872-5a77-4279-a045-55a6abe7a781","Type":"ContainerStarted","Data":"b97504bdeefc0c88a3a118339bf99f8ba931aab79259099a007751aa788e2690"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.599143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.600690 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.100660385 +0000 UTC m=+144.222499934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.610786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-55g75" event={"ID":"2b4b5943-89e2-483d-a034-1344fec03f98","Type":"ContainerStarted","Data":"2e9c909c36c637d8897ce0d1327b548aa0f3ec0f3146508afbd57304f8457361"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.611446 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.612628 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.612665 4743 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-55g75" podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.616759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" event={"ID":"95c66716-8eaa-4e63-a30d-5b871fffd090","Type":"ContainerStarted","Data":"078d51b9ff0893f15bbf9ffefdf67ed039b5a55f274cfb056e0f65a68f69771f"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.627395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zjh85" event={"ID":"8c219812-f1dd-44da-9a23-764167668a0f","Type":"ContainerStarted","Data":"f8bcabd5954baf30d8df813e07c1bb93ce502ca8d5b521bd6bbf568841885bf0"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.634426 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" event={"ID":"2e800807-1cef-4dcb-9001-48322127beb9","Type":"ContainerStarted","Data":"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.635033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.636804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xbdmm" event={"ID":"e411e757-a7ba-45fe-b248-8eca5d468049","Type":"ContainerStarted","Data":"7d43a472cf4c3bf7957794698ec5911281c136952d3286f7c97e93eb09aa7b75"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.642357 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" event={"ID":"50460f60-2828-42ab-94aa-3ae9d13a5a1e","Type":"ContainerStarted","Data":"a3cbb50472f90a829dcff7c9dd19a0e5b0fbd1001a0ffa69cb5fecb0ef28e219"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.651378 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5n9pg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.651468 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.655274 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.683480 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bl29m"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.687606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" event={"ID":"c3569861-ce7b-4e88-a5ec-77a5ea7e995e","Type":"ContainerStarted","Data":"f615a345553ad30272fec62cdda8e1845bac6039b91e81f69f5532d222d21fc1"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.689194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sghb" event={"ID":"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0","Type":"ContainerStarted","Data":"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e"} Nov 25 16:01:04 crc 
kubenswrapper[4743]: I1125 16:01:04.692626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" event={"ID":"baeafcba-9592-4136-9893-4bf3b9295041","Type":"ContainerStarted","Data":"2e803c23206fbb200c00facd5c5fad70f4f1204969966ca692b936825510303a"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.693859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ldzq4" event={"ID":"505a2c86-f87d-4179-9156-7c6b98ba9b84","Type":"ContainerStarted","Data":"b862fd759fa99d4194ceeb44ea0cc759d68c3095bd9826478b7aa05b88fba44d"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.700477 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-84rbl"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.700522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" event={"ID":"01b5ca28-1828-40b6-97cf-093f8027dab3","Type":"ContainerStarted","Data":"c85e15f2c5682c7b56b293b8eb4f040b8cdf644d412a7ea7d49bf0fc8180a68b"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.700986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.702034 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9"] Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.702153 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.202135828 +0000 UTC m=+144.323975377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.715682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" event={"ID":"b7e20dd3-f239-419d-bc24-5e38d66e7803","Type":"ContainerStarted","Data":"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.716564 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.725813 4743 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-w8pxz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.725884 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Nov 25 16:01:04 crc kubenswrapper[4743]: W1125 16:01:04.731989 4743 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb60121d_df03_4f88_a9e5_118105c6ce94.slice/crio-f281c785749079aa9cefca5e4ed0ee64cbb8b90a3895c43b3f70af976c1786cc WatchSource:0}: Error finding container f281c785749079aa9cefca5e4ed0ee64cbb8b90a3895c43b3f70af976c1786cc: Status 404 returned error can't find the container with id f281c785749079aa9cefca5e4ed0ee64cbb8b90a3895c43b3f70af976c1786cc Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.744304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xxzs"] Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.759413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" event={"ID":"fc5c20fa-6555-4502-8d1e-620e985c9607","Type":"ContainerStarted","Data":"048a210d460a6ee210511f05eaf7b967a8bb014d4a9931527e94c86a1ead206d"} Nov 25 16:01:04 crc kubenswrapper[4743]: W1125 16:01:04.760406 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83d7b2d_d239_4476_b094_929bce4f8b20.slice/crio-7081f0fcd121923c1b5f892b586e47a884ffa0b1f1203aebb78729f5e29472d5 WatchSource:0}: Error finding container 7081f0fcd121923c1b5f892b586e47a884ffa0b1f1203aebb78729f5e29472d5: Status 404 returned error can't find the container with id 7081f0fcd121923c1b5f892b586e47a884ffa0b1f1203aebb78729f5e29472d5 Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.765746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" event={"ID":"a655856c-3900-4342-a094-dc03b84c8876","Type":"ContainerStarted","Data":"8f0bffd353d4a337791d31c671af9c9be6a6d8d1a3a01f2469b9a6461248b0a0"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.772516 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" 
event={"ID":"c99ef725-44cc-4a3f-9ee1-2df041f7a254","Type":"ContainerStarted","Data":"a8cd901f6c019a0e117162994c3208b1da33bce8a1223f397e9cbaaf983bb795"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.774036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" event={"ID":"a231a0eb-9935-44d2-abf0-733aa2d944a6","Type":"ContainerStarted","Data":"c6fd62de3bb9269f4068870045708084fe6f257494d27d3ab28aff2a07aaf4bd"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.777167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" event={"ID":"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de","Type":"ContainerStarted","Data":"22b2c5701b0ada58e6b343360327efd75bab98179c24bcccdee7ec074493e190"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.787221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" event={"ID":"125e1f77-3db7-4893-8127-fcd74903a65b","Type":"ContainerStarted","Data":"8febd8dc009f531fdba06c7444d365f3eb145191dde99fb5daedd891dba79133"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.797226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" event={"ID":"a6cf416d-49d9-4e74-846b-7d6923ae415f","Type":"ContainerStarted","Data":"ec9e725edef63567c8e6ad7636e39b0744b39cb9604627530d47d26174fe20c3"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.801686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" event={"ID":"89cbcba8-bb80-4471-b57f-55cd045b68d9","Type":"ContainerStarted","Data":"79bde2e207d390c1a2121fc1701729c92d1bd9b74d3fbd9c16d66d2af958b9e2"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.804409 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.807879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.307857849 +0000 UTC m=+144.429697598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.812526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" event={"ID":"a691b039-6ec5-4ba1-b64a-badfaaff730e","Type":"ContainerStarted","Data":"bd3604fa7d994e57637b81166cbb33b07ecaae5e03ca74b390e790241efb62e5"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.812584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" event={"ID":"a691b039-6ec5-4ba1-b64a-badfaaff730e","Type":"ContainerStarted","Data":"28b7a01bf9bfa414f6a06bbefadc068c2ffe2342c9831e6d3a3d525d052d122d"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.814781 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" event={"ID":"954c73bd-5c98-4909-a01b-f28ca7be011e","Type":"ContainerStarted","Data":"cf83000691dbd6673b624fe7d6c002f4c950d28c20ba26ff3f54592922a2629e"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.818498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gkgd" event={"ID":"e6f5cab6-00e5-44e0-8213-0754e21b2cb4","Type":"ContainerStarted","Data":"6cab75031195461498e508bca46ecd94247c1a01235e3cb7189ace0bcb61b22d"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.820455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" event={"ID":"150f435e-dbbf-4106-bc06-046dd7abb405","Type":"ContainerStarted","Data":"652cf3d74a1e4e403490140a7960904a13946bc1c2746962800d8d6c0dbe038c"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.822472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" event={"ID":"6007cac9-0c7c-4d71-b65c-aa10735ecce4","Type":"ContainerStarted","Data":"1d4ecbbfab43d0eae95af9395b9d74f49a172263877b8d19b6aa6854e76569fa"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.822503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" event={"ID":"6007cac9-0c7c-4d71-b65c-aa10735ecce4","Type":"ContainerStarted","Data":"098a7485ab11494e42ef4d9b47113949036007c4b7a1e4ede1144be84ab84da1"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.833968 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb" containerID="ae0d0402ed13139155062ac45dc82fa2a2324cd499029e6f419d1b9a4bf38668" exitCode=0 Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.836092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" event={"ID":"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb","Type":"ContainerDied","Data":"ae0d0402ed13139155062ac45dc82fa2a2324cd499029e6f419d1b9a4bf38668"} Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.838653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.840182 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.846560 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:04 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:04 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:04 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.847265 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:04 crc kubenswrapper[4743]: W1125 16:01:04.852144 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671bb022_0c33_494e_a398_9e980eb0d3fd.slice/crio-b3c68d85316376025c164beb9d1ca02c13f547d25c5e734c8d0302f2aba22d41 WatchSource:0}: Error finding container b3c68d85316376025c164beb9d1ca02c13f547d25c5e734c8d0302f2aba22d41: Status 404 returned error can't find the container with id b3c68d85316376025c164beb9d1ca02c13f547d25c5e734c8d0302f2aba22d41 Nov 25 
16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.907273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.907387 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.407353535 +0000 UTC m=+144.529193084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.907520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:04 crc kubenswrapper[4743]: E1125 16:01:04.909483 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:05.409471305 +0000 UTC m=+144.531311174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:04 crc kubenswrapper[4743]: I1125 16:01:04.984293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.012456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.012879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.51283153 +0000 UTC m=+144.634671079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.114760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.115072 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.615058283 +0000 UTC m=+144.736897832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.158642 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-55g75" podStartSLOduration=121.158624601 podStartE2EDuration="2m1.158624601s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.156144551 +0000 UTC m=+144.277984110" watchObservedRunningTime="2025-11-25 16:01:05.158624601 +0000 UTC m=+144.280464150" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.201552 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" podStartSLOduration=121.200153162 podStartE2EDuration="2m1.200153162s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.193870384 +0000 UTC m=+144.315709933" watchObservedRunningTime="2025-11-25 16:01:05.200153162 +0000 UTC m=+144.321992721" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.215720 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.216000 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.715955848 +0000 UTC m=+144.837795547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.234651 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4sghb" podStartSLOduration=121.234552732 podStartE2EDuration="2m1.234552732s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.228621705 +0000 UTC m=+144.350461264" watchObservedRunningTime="2025-11-25 16:01:05.234552732 +0000 UTC m=+144.356392281" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.273832 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ldzq4" podStartSLOduration=121.273807159 podStartE2EDuration="2m1.273807159s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.272399829 +0000 UTC m=+144.394239388" 
watchObservedRunningTime="2025-11-25 16:01:05.273807159 +0000 UTC m=+144.395646708" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.306839 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" podStartSLOduration=121.30682154 podStartE2EDuration="2m1.30682154s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.303453025 +0000 UTC m=+144.425292584" watchObservedRunningTime="2025-11-25 16:01:05.30682154 +0000 UTC m=+144.428661089" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.319041 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.319435 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.819417576 +0000 UTC m=+144.941257135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.395162 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" podStartSLOduration=121.395127561 podStartE2EDuration="2m1.395127561s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.390075907 +0000 UTC m=+144.511915476" watchObservedRunningTime="2025-11-25 16:01:05.395127561 +0000 UTC m=+144.516967110" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.422358 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.422905 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.922871763 +0000 UTC m=+145.044711312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.424687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.425293 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:05.92526729 +0000 UTC m=+145.047106839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.442854 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dmmkv" podStartSLOduration=121.442819506 podStartE2EDuration="2m1.442819506s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.434730077 +0000 UTC m=+144.556569656" watchObservedRunningTime="2025-11-25 16:01:05.442819506 +0000 UTC m=+144.564659055" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.472211 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" podStartSLOduration=121.472194184 podStartE2EDuration="2m1.472194184s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.470697852 +0000 UTC m=+144.592537421" watchObservedRunningTime="2025-11-25 16:01:05.472194184 +0000 UTC m=+144.594033733" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.511690 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-phb4q" podStartSLOduration=121.511667007 podStartE2EDuration="2m1.511667007s" podCreationTimestamp="2025-11-25 15:59:04 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.51141969 +0000 UTC m=+144.633259259" watchObservedRunningTime="2025-11-25 16:01:05.511667007 +0000 UTC m=+144.633506576" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.525887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.526305 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.026285989 +0000 UTC m=+145.148125528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.581207 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gwtwj" podStartSLOduration=121.581189007 podStartE2EDuration="2m1.581189007s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.581060164 +0000 UTC m=+144.702899733" watchObservedRunningTime="2025-11-25 16:01:05.581189007 +0000 UTC m=+144.703028556" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.627666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.628043 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.128023658 +0000 UTC m=+145.249863207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.728806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.729039 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.228985165 +0000 UTC m=+145.350824714 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.729339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.729838 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.229812239 +0000 UTC m=+145.351651938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.831371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.831933 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.331905468 +0000 UTC m=+145.453745027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.845801 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:05 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:05 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:05 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.846747 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.877783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gkgd" event={"ID":"e6f5cab6-00e5-44e0-8213-0754e21b2cb4","Type":"ContainerStarted","Data":"e0886f472bdb7e1e17274e4efdee607d5163b6a24cf20310492a4e96b27efa48"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.889492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" event={"ID":"01b5ca28-1828-40b6-97cf-093f8027dab3","Type":"ContainerStarted","Data":"d3e77469b1f425f9cb0765b94f6cb4a5e669d36ad636330b0b09bbcf337240d0"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 
16:01:05.921749 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" event={"ID":"954c73bd-5c98-4909-a01b-f28ca7be011e","Type":"ContainerStarted","Data":"1b89dd76b52400fb0a9ca78df4ac2b18e924d1e0934aaa4a751d3eaddc8853fb"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.937729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:05 crc kubenswrapper[4743]: E1125 16:01:05.939350 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.439329477 +0000 UTC m=+145.561169026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.940065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76kq2" event={"ID":"3b2a5449-98a1-46dd-893d-7a7b8e5bf0de","Type":"ContainerStarted","Data":"0ec6a20af5a53068899f22975fb774b761c4b1082a77fe28ee56cc17d675356f"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.943196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" event={"ID":"a691b039-6ec5-4ba1-b64a-badfaaff730e","Type":"ContainerStarted","Data":"611098ed300e913c52e56fb049bee5e60f78807a8699d1b9513f55a8d9f07c72"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.951371 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2gkgd" podStartSLOduration=5.951346116 podStartE2EDuration="5.951346116s" podCreationTimestamp="2025-11-25 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.905564125 +0000 UTC m=+145.027403684" watchObservedRunningTime="2025-11-25 16:01:05.951346116 +0000 UTC m=+145.073185665" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.951812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" 
event={"ID":"125e1f77-3db7-4893-8127-fcd74903a65b","Type":"ContainerStarted","Data":"2751b71ca142423382dbaef7a05e507acebde57cf735eb2c74a2982a5128c30a"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.953006 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.954782 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-f5d99 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.954839 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" podUID="125e1f77-3db7-4893-8127-fcd74903a65b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.964846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" event={"ID":"a6cf416d-49d9-4e74-846b-7d6923ae415f","Type":"ContainerStarted","Data":"2be373c606c6396a424f2b0b486369ad08777da98a4ef24ca7ae99e1c084834d"} Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.968915 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.973529 4743 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cbr2m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: 
connection refused" start-of-body= Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.973617 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" podUID="a6cf416d-49d9-4e74-846b-7d6923ae415f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.979463 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6mhkh" podStartSLOduration=121.979435338 podStartE2EDuration="2m1.979435338s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.970727053 +0000 UTC m=+145.092566622" watchObservedRunningTime="2025-11-25 16:01:05.979435338 +0000 UTC m=+145.101274887" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.979980 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" podStartSLOduration=65.979971244 podStartE2EDuration="1m5.979971244s" podCreationTimestamp="2025-11-25 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:05.94263534 +0000 UTC m=+145.064474909" watchObservedRunningTime="2025-11-25 16:01:05.979971244 +0000 UTC m=+145.101810793" Nov 25 16:01:05 crc kubenswrapper[4743]: I1125 16:01:05.989031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" event={"ID":"90a94872-5a77-4279-a045-55a6abe7a781","Type":"ContainerStarted","Data":"5c39bdbef5d35d8325320843c3b194c3bd1431b1d47712dac8468aa22e6a208a"} Nov 25 
16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.005836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" event={"ID":"a655856c-3900-4342-a094-dc03b84c8876","Type":"ContainerStarted","Data":"ded00171ee495227e87517f1d9cea5607222471b721eb32dd233a5266d5d35d6"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.013195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" event={"ID":"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c","Type":"ContainerStarted","Data":"a6cb6e8cd15e212268ac82007df37d8dd08c35c25c6e5856f22df44b7138f900"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.026713 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" podStartSLOduration=122.02666982 podStartE2EDuration="2m2.02666982s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.003328942 +0000 UTC m=+145.125168511" watchObservedRunningTime="2025-11-25 16:01:06.02666982 +0000 UTC m=+145.148509389" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.026871 4743 generic.go:334] "Generic (PLEG): container finished" podID="50460f60-2828-42ab-94aa-3ae9d13a5a1e" containerID="edd9fc727c58dbd626ef910d96bf02cefd8885d2cd62a72993acddfa4d7b9381" exitCode=0 Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.027479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" event={"ID":"50460f60-2828-42ab-94aa-3ae9d13a5a1e","Type":"ContainerDied","Data":"edd9fc727c58dbd626ef910d96bf02cefd8885d2cd62a72993acddfa4d7b9381"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.043575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.044989 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.544973806 +0000 UTC m=+145.666813355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.047393 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" podStartSLOduration=122.047354304 podStartE2EDuration="2m2.047354304s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.031163997 +0000 UTC m=+145.153003556" watchObservedRunningTime="2025-11-25 16:01:06.047354304 +0000 UTC m=+145.169193853" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.049079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" 
event={"ID":"fb60121d-df03-4f88-a9e5-118105c6ce94","Type":"ContainerStarted","Data":"be6be2dac8138cdb204413c6baae8bd8e91dca34761045554c96eaae76ee5f19"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.049145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" event={"ID":"fb60121d-df03-4f88-a9e5-118105c6ce94","Type":"ContainerStarted","Data":"f281c785749079aa9cefca5e4ed0ee64cbb8b90a3895c43b3f70af976c1786cc"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.065870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" event={"ID":"d2185f4f-1936-4335-9aab-67538f5e3888","Type":"ContainerStarted","Data":"f173a865f171d65d849f81dba1c93560a0306f84c94ea8fa0fc09235d1668866"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.065943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" event={"ID":"d2185f4f-1936-4335-9aab-67538f5e3888","Type":"ContainerStarted","Data":"3fc2819765dc4004c2c1561ea033d470fcc4d339c2257eeeedb2dc58af24fb1d"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.067949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xbdmm" event={"ID":"e411e757-a7ba-45fe-b248-8eca5d468049","Type":"ContainerStarted","Data":"14f8d99e587cbfc7528ad5e0044690d87f91fbc4afb3f2c8549da0134a78f38a"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.096220 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nqrqh" podStartSLOduration=122.096192271 podStartE2EDuration="2m2.096192271s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.089269856 +0000 UTC m=+145.211109405" watchObservedRunningTime="2025-11-25 16:01:06.096192271 +0000 UTC m=+145.218031810" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.096871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" event={"ID":"c3569861-ce7b-4e88-a5ec-77a5ea7e995e","Type":"ContainerStarted","Data":"3cfb73cec257a6894e22158dc67efef8d0edfbf14d5100a8b6e51ff37ad2ec3c"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.110794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" event={"ID":"b022022e-00fd-43df-99b2-eb91f39e4264","Type":"ContainerStarted","Data":"718584ed81b88fd6fc87981d3666647667f920815f36bfdefda9032b3e29427c"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.111697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" event={"ID":"b022022e-00fd-43df-99b2-eb91f39e4264","Type":"ContainerStarted","Data":"fa22f294a060d46dd56482cbc77b9c3b0bdcf70a28eaec179b918f851853afe7"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.118932 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8vr4f" podStartSLOduration=122.118893951 podStartE2EDuration="2m2.118893951s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.109085345 +0000 UTC m=+145.230924914" watchObservedRunningTime="2025-11-25 16:01:06.118893951 +0000 UTC m=+145.240733490" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.138297 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-58mwb" podStartSLOduration=122.138279158 podStartE2EDuration="2m2.138279158s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.131361403 +0000 UTC m=+145.253200962" watchObservedRunningTime="2025-11-25 16:01:06.138279158 +0000 UTC m=+145.260118697" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.150054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.151192 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xbdmm" podStartSLOduration=6.151136131 podStartE2EDuration="6.151136131s" podCreationTimestamp="2025-11-25 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.148984289 +0000 UTC m=+145.270823838" watchObservedRunningTime="2025-11-25 16:01:06.151136131 +0000 UTC m=+145.272975680" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.151979 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.651964154 +0000 UTC m=+145.773803703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.158852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" event={"ID":"baeafcba-9592-4136-9893-4bf3b9295041","Type":"ContainerStarted","Data":"2d385007af954b65673249bd7d784298ec67db397fffe5fe87f6f14afe115541"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.160837 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.161145 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pbz24 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.161199 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.183352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" 
event={"ID":"8421cc57-3b6e-44b6-a732-75cce2177e80","Type":"ContainerStarted","Data":"2693014a3706b6e29b4c80697f93e297894575414b087853e45246e79eb60e54"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.194998 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" event={"ID":"150f435e-dbbf-4106-bc06-046dd7abb405","Type":"ContainerStarted","Data":"acf0a505a5ad74dc9e914d86efe5d339ceeea2c97334620f555f9e26bdf31b8a"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.199206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" event={"ID":"c99ef725-44cc-4a3f-9ee1-2df041f7a254","Type":"ContainerStarted","Data":"5ec26aea07cef6a5c3fc163e83bce8f257bf206c2ce3c20e49ea40c39e3d0270"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.204869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" event={"ID":"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60","Type":"ContainerStarted","Data":"35ca4420bb3643697162f2029b4dce91c0baa5da9add4323ac758ab6f30f5d0f"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.210284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxzs" event={"ID":"671bb022-0c33-494e-a398-9e980eb0d3fd","Type":"ContainerStarted","Data":"b3c68d85316376025c164beb9d1ca02c13f547d25c5e734c8d0302f2aba22d41"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.222193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" event={"ID":"7d64e696-446f-4e5d-a276-4a9f18f291b2","Type":"ContainerStarted","Data":"d53dd64bb90ae67b87a53edb8154db9a0c5135f38b9cfc110b60c67182117f71"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.227988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" event={"ID":"f83d7b2d-d239-4476-b094-929bce4f8b20","Type":"ContainerStarted","Data":"da9c57d627922c8a7171f4334f9238bcae0859e3b52754b0ab038aad183a2a3a"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.228035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" event={"ID":"f83d7b2d-d239-4476-b094-929bce4f8b20","Type":"ContainerStarted","Data":"7081f0fcd121923c1b5f892b586e47a884ffa0b1f1203aebb78729f5e29472d5"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.228231 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" podStartSLOduration=122.228217484 podStartE2EDuration="2m2.228217484s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.183179464 +0000 UTC m=+145.305019043" watchObservedRunningTime="2025-11-25 16:01:06.228217484 +0000 UTC m=+145.350057033" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.229097 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4wvzk" podStartSLOduration=122.229088459 podStartE2EDuration="2m2.229088459s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.225100536 +0000 UTC m=+145.346940085" watchObservedRunningTime="2025-11-25 16:01:06.229088459 +0000 UTC m=+145.350928028" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.233560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zjh85" 
event={"ID":"8c219812-f1dd-44da-9a23-764167668a0f","Type":"ContainerStarted","Data":"409244c4b824b7ba514efadaad451e337a5946cd5e8ba5d4996189ed64695f0f"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.234483 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.235356 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjh85 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.235390 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjh85" podUID="8c219812-f1dd-44da-9a23-764167668a0f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.245545 4743 generic.go:334] "Generic (PLEG): container finished" podID="95c66716-8eaa-4e63-a30d-5b871fffd090" containerID="a38d79159370fbff869f19513bbbccef19e93ad27cb27f4c937a93cef600db82" exitCode=0 Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.245635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" event={"ID":"95c66716-8eaa-4e63-a30d-5b871fffd090","Type":"ContainerDied","Data":"a38d79159370fbff869f19513bbbccef19e93ad27cb27f4c937a93cef600db82"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.246257 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-n2vkg" podStartSLOduration=122.246246223 podStartE2EDuration="2m2.246246223s" podCreationTimestamp="2025-11-25 
15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.243512006 +0000 UTC m=+145.365351565" watchObservedRunningTime="2025-11-25 16:01:06.246246223 +0000 UTC m=+145.368085772" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.249437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" event={"ID":"bae5d0d6-18e7-40ac-a0c1-bd7e5fa8e687","Type":"ContainerStarted","Data":"04470f3c39ca1b456675c47bf7caa18804b73a2f5bad3e6a7da5ab4e1528fa1f"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.251166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.252024 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.752002934 +0000 UTC m=+145.873842553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.266265 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" podStartSLOduration=122.266245956 podStartE2EDuration="2m2.266245956s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.264744354 +0000 UTC m=+145.386583903" watchObservedRunningTime="2025-11-25 16:01:06.266245956 +0000 UTC m=+145.388085505" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.270797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" event={"ID":"6007cac9-0c7c-4d71-b65c-aa10735ecce4","Type":"ContainerStarted","Data":"cf938ca6cf5b32c3618f8e161a7a73b0ecb59adc10e704eefd01c16090b2f1ee"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.277170 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" event={"ID":"a231a0eb-9935-44d2-abf0-733aa2d944a6","Type":"ContainerStarted","Data":"abad45f2bef43669a7ff1a52a0c6580da2eb709b815a2857de23bf0b41fd5775"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.277917 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:06 crc kubenswrapper[4743]: 
I1125 16:01:06.290360 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l4t62" podStartSLOduration=122.290345957 podStartE2EDuration="2m2.290345957s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.289303287 +0000 UTC m=+145.411142836" watchObservedRunningTime="2025-11-25 16:01:06.290345957 +0000 UTC m=+145.412185506" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.292535 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4v47p container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.292569 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" podUID="a231a0eb-9935-44d2-abf0-733aa2d944a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.298699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" event={"ID":"63c6e7d6-50c6-4a7e-b318-edbfa0496006","Type":"ContainerStarted","Data":"a26096393c45e5d4bc0d3272e99e548f7588ed01a96e0452c3c13f9b042f49b3"} Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.300289 4743 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-w8pxz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Nov 25 
16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.300388 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.301479 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5n9pg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.301522 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.302439 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.302468 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-55g75" podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.344416 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qph9q" podStartSLOduration=122.34439478 podStartE2EDuration="2m2.34439478s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.332565497 +0000 UTC m=+145.454405046" watchObservedRunningTime="2025-11-25 16:01:06.34439478 +0000 UTC m=+145.466234329" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.355053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.365877 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.865857836 +0000 UTC m=+145.987697475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.374979 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.377779 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zjh85" podStartSLOduration=122.377760082 podStartE2EDuration="2m2.377760082s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.37309912 +0000 UTC m=+145.494938679" watchObservedRunningTime="2025-11-25 16:01:06.377760082 +0000 UTC m=+145.499599631" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.442079 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7j4pk" podStartSLOduration=122.442056575 podStartE2EDuration="2m2.442056575s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.399228287 +0000 UTC m=+145.521067846" watchObservedRunningTime="2025-11-25 16:01:06.442056575 +0000 UTC m=+145.563896144" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.442918 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-swbj6" podStartSLOduration=122.442912699 podStartE2EDuration="2m2.442912699s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.427549326 +0000 UTC m=+145.549388885" watchObservedRunningTime="2025-11-25 16:01:06.442912699 +0000 UTC m=+145.564752248" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.452285 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" podStartSLOduration=122.452266923 podStartE2EDuration="2m2.452266923s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:06.452020376 +0000 UTC m=+145.573859945" watchObservedRunningTime="2025-11-25 16:01:06.452266923 +0000 UTC m=+145.574106472" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.472040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.472654 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:06.972605207 +0000 UTC m=+146.094444756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.574070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.574509 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.074493499 +0000 UTC m=+146.196333058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.674955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.675453 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.175421765 +0000 UTC m=+146.297261314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.777185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.777536 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.277520365 +0000 UTC m=+146.399359914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.842403 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:06 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:06 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:06 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.842472 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.878033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.878218 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:07.378190854 +0000 UTC m=+146.500030403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.878269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.878762 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.37875464 +0000 UTC m=+146.500594189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.979634 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.979827 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.47980238 +0000 UTC m=+146.601641929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:06 crc kubenswrapper[4743]: I1125 16:01:06.979960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:06 crc kubenswrapper[4743]: E1125 16:01:06.980326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.480315834 +0000 UTC m=+146.602155383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.081148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.081348 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.581321183 +0000 UTC m=+146.703160732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.081401 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.081775 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.581765875 +0000 UTC m=+146.703605424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.182430 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.182587 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.682565048 +0000 UTC m=+146.804404597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.182696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.183016 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.683006731 +0000 UTC m=+146.804846280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.283851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.284061 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.78402754 +0000 UTC m=+146.905867099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.284309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.284730 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.784720839 +0000 UTC m=+146.906560568 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.304069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" event={"ID":"a655856c-3900-4342-a094-dc03b84c8876","Type":"ContainerStarted","Data":"92860cb1c8d948283233a3056d29d0da68c324836647177a79e27e33f624b1e2"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.320517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" event={"ID":"b022022e-00fd-43df-99b2-eb91f39e4264","Type":"ContainerStarted","Data":"3e2d9778ac20b2cc06e9d7e9756638b867546b52824e7622ba56dc297cda10e8"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.326260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxzs" event={"ID":"671bb022-0c33-494e-a398-9e980eb0d3fd","Type":"ContainerStarted","Data":"5bf727679aa999524c27b63b382f86447ed917a65c8f06e7f09ea2f89bda712a"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.326301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxzs" event={"ID":"671bb022-0c33-494e-a398-9e980eb0d3fd","Type":"ContainerStarted","Data":"6835a61a872b5f2abed750d486cad886f66364c9f8584576da223de335c2648f"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.326398 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.328634 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-j9hhf" podStartSLOduration=123.328607697 podStartE2EDuration="2m3.328607697s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.324275874 +0000 UTC m=+146.446115443" watchObservedRunningTime="2025-11-25 16:01:07.328607697 +0000 UTC m=+146.450447246" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.332698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" event={"ID":"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c","Type":"ContainerStarted","Data":"9197a3dfdd6c3bfd636b1e6f00ae27ee9d6c4d6d6f07ab3e65948e231f7cdba1"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.332768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" event={"ID":"e7b1cd86-dbe1-4674-9159-2e9dbcc6182c","Type":"ContainerStarted","Data":"363057aba399763dc539aacae11f3590d501e7e30d4dd0fd19a0d97e938f141d"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.335993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" event={"ID":"50460f60-2828-42ab-94aa-3ae9d13a5a1e","Type":"ContainerStarted","Data":"ee7eb66b0aa08bb0da59aefb5661e6eddf19bb24687d0e6ceb4ace8563cf9108"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.348049 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4dtjj" podStartSLOduration=123.348025924 podStartE2EDuration="2m3.348025924s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 16:01:07.347567431 +0000 UTC m=+146.469407010" watchObservedRunningTime="2025-11-25 16:01:07.348025924 +0000 UTC m=+146.469865473" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.355691 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" event={"ID":"1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb","Type":"ContainerStarted","Data":"52e742ea725f98c98b2bc75fb30b55c5e0f84f878cdf89125fb36297c88753e3"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.373392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" event={"ID":"95c66716-8eaa-4e63-a30d-5b871fffd090","Type":"ContainerStarted","Data":"a86cbc687917f3957938ec8bf8e95556125f2c9ef826b3ee30e7ab347093dfa5"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.373585 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.385609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.385711 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.885689047 +0000 UTC m=+147.007528596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.385873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.386137 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.886129759 +0000 UTC m=+147.007969308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.386991 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.387404 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.392287 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5xxzs" podStartSLOduration=7.392267672 podStartE2EDuration="7.392267672s" podCreationTimestamp="2025-11-25 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.390005508 +0000 UTC m=+146.511845087" watchObservedRunningTime="2025-11-25 16:01:07.392267672 +0000 UTC m=+146.514107221" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.400224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" event={"ID":"8421cc57-3b6e-44b6-a732-75cce2177e80","Type":"ContainerStarted","Data":"6b0598eee9d247f4f8d866703112bfb2b143e1d2eb977e3f3396c4e0d3f8d989"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.400546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" 
event={"ID":"8421cc57-3b6e-44b6-a732-75cce2177e80","Type":"ContainerStarted","Data":"8d3d9e4ed19c6aba92fdff46f0dda35cdfa92581c72baf2cd9efd70fbc61f49e"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.408636 4743 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-t6sfd container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.409050 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" podUID="1e779fa4-6294-4d8b-9c70-0cfc6f9fa9fb" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.420329 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2h6h9" podStartSLOduration=123.420308423 podStartE2EDuration="2m3.420308423s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.406176894 +0000 UTC m=+146.528016463" watchObservedRunningTime="2025-11-25 16:01:07.420308423 +0000 UTC m=+146.542147982" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.431411 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqk6q" event={"ID":"cffdcaea-9ca5-44de-bb0b-83d3c8d1da60","Type":"ContainerStarted","Data":"30ea409c8dc31753b91355b5758c82e61726226a7742a400f89b951c1e4c70f4"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.456352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" event={"ID":"c3569861-ce7b-4e88-a5ec-77a5ea7e995e","Type":"ContainerStarted","Data":"49579c890755952e244942653dd7b22a29fbb614dfa29be25e716be1649c100a"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.462983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" event={"ID":"954c73bd-5c98-4909-a01b-f28ca7be011e","Type":"ContainerStarted","Data":"1aa46787314abfe7fe625dfd668e4e97a5b96b99ebadf9939aa978cc689a758c"} Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.463665 4743 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cbr2m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.463757 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4v47p container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.463780 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" podUID="a231a0eb-9935-44d2-abf0-733aa2d944a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.463754 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" podUID="a6cf416d-49d9-4e74-846b-7d6923ae415f" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.464361 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-f5d99 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.464404 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" podUID="125e1f77-3db7-4893-8127-fcd74903a65b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.464703 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjh85 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.464753 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjh85" podUID="8c219812-f1dd-44da-9a23-764167668a0f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.466396 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pbz24 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 25 16:01:07 crc 
kubenswrapper[4743]: I1125 16:01:07.466434 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.474654 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.479837 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.493382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.495078 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:07.99505276 +0000 UTC m=+147.116892309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.496447 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-84rbl" podStartSLOduration=123.496408499 podStartE2EDuration="2m3.496408499s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.44678025 +0000 UTC m=+146.568619809" watchObservedRunningTime="2025-11-25 16:01:07.496408499 +0000 UTC m=+146.618248048" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.496794 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" podStartSLOduration=123.49678808 podStartE2EDuration="2m3.49678808s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.496171692 +0000 UTC m=+146.618011251" watchObservedRunningTime="2025-11-25 16:01:07.49678808 +0000 UTC m=+146.618627639" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.539843 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" podStartSLOduration=123.539828843 podStartE2EDuration="2m3.539828843s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.537342043 +0000 UTC m=+146.659181592" watchObservedRunningTime="2025-11-25 16:01:07.539828843 +0000 UTC m=+146.661668392" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.556620 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" podStartSLOduration=123.556584665 podStartE2EDuration="2m3.556584665s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.554841746 +0000 UTC m=+146.676681325" watchObservedRunningTime="2025-11-25 16:01:07.556584665 +0000 UTC m=+146.678424214" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.596696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.607185 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-krv4b" podStartSLOduration=123.607165092 podStartE2EDuration="2m3.607165092s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:07.606494304 +0000 UTC m=+146.728333873" watchObservedRunningTime="2025-11-25 16:01:07.607165092 +0000 UTC m=+146.729004641" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.610936 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.110918548 +0000 UTC m=+147.232758097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.699140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.699514 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.199497626 +0000 UTC m=+147.321337175 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.800448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.800930 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.300912826 +0000 UTC m=+147.422752375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.838548 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:07 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:07 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:07 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.838646 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.901684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.901849 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:08.401816422 +0000 UTC m=+147.523655971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:07 crc kubenswrapper[4743]: I1125 16:01:07.901939 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:07 crc kubenswrapper[4743]: E1125 16:01:07.902440 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.402427619 +0000 UTC m=+147.524267168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.003772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.003981 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.503956373 +0000 UTC m=+147.625795922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.104855 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.105193 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.605179687 +0000 UTC m=+147.727019226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.205627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.205698 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.705666241 +0000 UTC m=+147.827505800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.206014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.206373 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.70636336 +0000 UTC m=+147.828202909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.306645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.306856 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.806825934 +0000 UTC m=+147.928665483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.307027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.307387 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.807376069 +0000 UTC m=+147.929215668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.409098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.409497 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:08.909476629 +0000 UTC m=+148.031316178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.471618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" event={"ID":"50460f60-2828-42ab-94aa-3ae9d13a5a1e","Type":"ContainerStarted","Data":"0aee7fb507f4a7c1687425e4cf8aae6f32ae4ffd07b1221f1c7f23ff852c314e"} Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.471659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.472548 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-zjh85 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.472582 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zjh85" podUID="8c219812-f1dd-44da-9a23-764167668a0f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.473293 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-f5d99 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.473317 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" podUID="125e1f77-3db7-4893-8127-fcd74903a65b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.511241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.512268 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.012257207 +0000 UTC m=+148.134096756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.514667 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" podStartSLOduration=124.514645664 podStartE2EDuration="2m4.514645664s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:08.512495764 +0000 UTC m=+147.634335333" watchObservedRunningTime="2025-11-25 16:01:08.514645664 +0000 UTC m=+147.636485223" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.525212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.612645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.612858 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:09.112826253 +0000 UTC m=+148.234665802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.613230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.617285 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.117260159 +0000 UTC m=+148.239099908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.714494 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.21446643 +0000 UTC m=+148.336305979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714576 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.714724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.715095 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.215074388 +0000 UTC m=+148.336913937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.715792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.730721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.740022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.740098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.816256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.816411 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.316389924 +0000 UTC m=+148.438229473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.816476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.816818 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.316807516 +0000 UTC m=+148.438647065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.845334 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:08 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:08 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:08 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.845407 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.887629 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.894664 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.917627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.917856 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.417806854 +0000 UTC m=+148.539646403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:08 crc kubenswrapper[4743]: I1125 16:01:08.917929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:08 crc kubenswrapper[4743]: E1125 16:01:08.918553 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.418544445 +0000 UTC m=+148.540383994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.001909 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.019098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.019337 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.519276366 +0000 UTC m=+148.641115915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.019409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.019820 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.519802051 +0000 UTC m=+148.641641780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.122629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.123166 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.623142995 +0000 UTC m=+148.744982544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.225319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.225856 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.725827681 +0000 UTC m=+148.847667230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.334992 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.336152 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.836135692 +0000 UTC m=+148.957975241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: W1125 16:01:09.360854 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-63dca8b7bb1c3be5dc9b1bbc1b9d69c21c2a6f817943cb9911b9482cc01a4ec5 WatchSource:0}: Error finding container 63dca8b7bb1c3be5dc9b1bbc1b9d69c21c2a6f817943cb9911b9482cc01a4ec5: Status 404 returned error can't find the container with id 63dca8b7bb1c3be5dc9b1bbc1b9d69c21c2a6f817943cb9911b9482cc01a4ec5 Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.437382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.437887 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:09.937861931 +0000 UTC m=+149.059701490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.475774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"90edd66900888643488a24e5ca1bab751dbba3b4b91bd44fd0a1ed83095e5213"} Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.476774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"63dca8b7bb1c3be5dc9b1bbc1b9d69c21c2a6f817943cb9911b9482cc01a4ec5"} Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.478073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0f6281335f33d35f702ec2948b06962510727c26aad540f001055c4ba60232b2"} Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.538863 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.539076 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.039041854 +0000 UTC m=+149.160881413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.539672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.541897 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.041869524 +0000 UTC m=+149.163709273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.640986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.641492 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.141466403 +0000 UTC m=+149.263305942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.742707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.743207 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.243187971 +0000 UTC m=+149.365027520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.843272 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:09 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:09 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:09 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.843561 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.844104 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.844305 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:10.344271382 +0000 UTC m=+149.466110931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.844480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.844910 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.34490126 +0000 UTC m=+149.466740809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:09 crc kubenswrapper[4743]: I1125 16:01:09.945575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:09 crc kubenswrapper[4743]: E1125 16:01:09.945961 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.445933809 +0000 UTC m=+149.567773358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.048305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.048762 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.548741888 +0000 UTC m=+149.670581437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.148988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.149313 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.649266253 +0000 UTC m=+149.771105812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.149749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.150287 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.650275682 +0000 UTC m=+149.772115421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.250429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.250607 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.75056485 +0000 UTC m=+149.872404419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.251757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.252111 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.752099794 +0000 UTC m=+149.873939353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.353369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.353773 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.85375278 +0000 UTC m=+149.975592329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.455172 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.455539 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:10.95552412 +0000 UTC m=+150.077363669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.503725 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"18c978fadb1659fa4368f4d27cbf5121d50628cbd5db0a69882adbc7c46460ca"} Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.521101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6fbc7afd777bf468bfd3581dc16d3eda40b3724e151f3f372f6604d344158186"} Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.546092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" event={"ID":"63c6e7d6-50c6-4a7e-b318-edbfa0496006","Type":"ContainerStarted","Data":"d10462916ef1d3a732890a08d4ed7333ac8a71530ac483b87cdae07aedee4899"} Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.558090 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.559024 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.059003629 +0000 UTC m=+150.180843188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.574701 4743 generic.go:334] "Generic (PLEG): container finished" podID="01b5ca28-1828-40b6-97cf-093f8027dab3" containerID="d3e77469b1f425f9cb0765b94f6cb4a5e669d36ad636330b0b09bbcf337240d0" exitCode=0 Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.575166 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" event={"ID":"01b5ca28-1828-40b6-97cf-093f8027dab3","Type":"ContainerDied","Data":"d3e77469b1f425f9cb0765b94f6cb4a5e669d36ad636330b0b09bbcf337240d0"} Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.577176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bd44d27aaaf93178a068d3f16ff8aeb9f1e291b005b9abd271695225e1020f83"} Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.577705 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.660307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.660878 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.160860332 +0000 UTC m=+150.282699951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.761141 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.761346 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.261315534 +0000 UTC m=+150.383155093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.761399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.761802 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.261790868 +0000 UTC m=+150.383630487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.839614 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:10 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:10 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:10 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.839679 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.862450 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.862716 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:11.362664073 +0000 UTC m=+150.484503822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.862836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.863311 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.36329635 +0000 UTC m=+150.485136079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.963536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.963713 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.463688771 +0000 UTC m=+150.585528320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:10 crc kubenswrapper[4743]: I1125 16:01:10.963764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:10 crc kubenswrapper[4743]: E1125 16:01:10.964100 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.464088882 +0000 UTC m=+150.585928431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.064861 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.065197 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.565179833 +0000 UTC m=+150.687019382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.091907 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.092484 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.099119 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.106802 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.107222 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.166771 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.167084 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.167156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.167208 
4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.667190581 +0000 UTC m=+150.789030130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.184536 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.185456 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.197331 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.212730 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269097 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjfjb\" (UniqueName: \"kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.269417 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:11.769395773 +0000 UTC m=+150.891235322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.269615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.270337 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.770325729 +0000 UTC m=+150.892165278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.296103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.317317 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.319110 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.323652 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.331183 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.370946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.371097 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.8710766 +0000 UTC m=+150.992916159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.371476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjfjb\" (UniqueName: \"kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.371556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.371620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.371643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities\") pod \"community-operators-qsq8z\" (UID: 
\"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.372121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.372905 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.872892101 +0000 UTC m=+150.994731650 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.372996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content\") pod \"community-operators-qsq8z\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.393403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjfjb\" (UniqueName: \"kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb\") pod \"community-operators-qsq8z\" (UID: 
\"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.419409 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.472677 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.472892 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.972859591 +0000 UTC m=+151.094699140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.473152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.473211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.473236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsqz\" (UniqueName: \"kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.473302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.473505 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:11.973495178 +0000 UTC m=+151.095334727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.498376 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.499520 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.501470 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.510229 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575802 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6knp8\" (UniqueName: \"kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575922 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krsqz\" (UniqueName: \"kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.575968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.576065 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.076051261 +0000 UTC m=+151.197890800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.576416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.576717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.607406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsqz\" (UniqueName: \"kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz\") pod \"certified-operators-46wjh\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.610084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" event={"ID":"63c6e7d6-50c6-4a7e-b318-edbfa0496006","Type":"ContainerStarted","Data":"881cb18986815fd84c707ed5dda3b23bc941d0cf8a3096b86e4ff927f657695f"} Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 
16:01:11.646261 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.680047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6knp8\" (UniqueName: \"kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.680149 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.680239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.680288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.681565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.682440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.682779 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.182764111 +0000 UTC m=+151.304603660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.698992 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mg9h4"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.714456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6knp8\" (UniqueName: \"kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8\") pod \"community-operators-4hd5b\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " pod="openshift-marketplace/community-operators-4hd5b" Nov 25 
16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.720751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg9h4"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.720880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.744825 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.797504 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.801492 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.801674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.801739 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nf5d\" (UniqueName: 
\"kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.802146 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.302125667 +0000 UTC m=+151.423965216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.821603 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.841627 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:11 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:11 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:11 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.841674 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.892623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.893798 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hmdm5" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.903061 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.903118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.903145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.903167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nf5d\" (UniqueName: \"kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.904232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.904441 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:11 crc kubenswrapper[4743]: E1125 16:01:11.906031 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:12.405998056 +0000 UTC m=+151.527837775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:11 crc kubenswrapper[4743]: I1125 16:01:11.946199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nf5d\" (UniqueName: \"kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d\") pod \"certified-operators-mg9h4\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") " pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.007649 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.009090 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.509074773 +0000 UTC m=+151.630914322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.080060 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mg9h4" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.093343 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.108752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.109180 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.609167065 +0000 UTC m=+151.731006614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.131549 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.137063 4743 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 25 16:01:12 crc kubenswrapper[4743]: W1125 16:01:12.168987 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052deadf_ed22_4688_a6fe_0b1039308499.slice/crio-bcf8998037a328539240ed69571ff4f9022a7f6d1e6e7c6dbdb026847e3a4422 WatchSource:0}: Error finding container bcf8998037a328539240ed69571ff4f9022a7f6d1e6e7c6dbdb026847e3a4422: Status 404 returned error can't find the container with id bcf8998037a328539240ed69571ff4f9022a7f6d1e6e7c6dbdb026847e3a4422 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.212005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume\") pod \"01b5ca28-1828-40b6-97cf-093f8027dab3\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.212105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdfgh\" (UniqueName: 
\"kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh\") pod \"01b5ca28-1828-40b6-97cf-093f8027dab3\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.212192 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.212233 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume\") pod \"01b5ca28-1828-40b6-97cf-093f8027dab3\" (UID: \"01b5ca28-1828-40b6-97cf-093f8027dab3\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.212626 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.712582852 +0000 UTC m=+151.834422401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.212881 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume" (OuterVolumeSpecName: "config-volume") pod "01b5ca28-1828-40b6-97cf-093f8027dab3" (UID: "01b5ca28-1828-40b6-97cf-093f8027dab3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.221812 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "01b5ca28-1828-40b6-97cf-093f8027dab3" (UID: "01b5ca28-1828-40b6-97cf-093f8027dab3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.223757 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh" (OuterVolumeSpecName: "kube-api-access-hdfgh") pod "01b5ca28-1828-40b6-97cf-093f8027dab3" (UID: "01b5ca28-1828-40b6-97cf-093f8027dab3"). InnerVolumeSpecName "kube-api-access-hdfgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.310479 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.313649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.313760 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdfgh\" (UniqueName: \"kubernetes.io/projected/01b5ca28-1828-40b6-97cf-093f8027dab3-kube-api-access-hdfgh\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.313776 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01b5ca28-1828-40b6-97cf-093f8027dab3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.313786 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01b5ca28-1828-40b6-97cf-093f8027dab3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.314009 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.813996812 +0000 UTC m=+151.935836361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: W1125 16:01:12.323157 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a8b728_0425_424c_af4f_5c9a3c20ff4c.slice/crio-639614d12ecd9ea94c500015416732074833873978e78b5fcefcb6296fe4a212 WatchSource:0}: Error finding container 639614d12ecd9ea94c500015416732074833873978e78b5fcefcb6296fe4a212: Status 404 returned error can't find the container with id 639614d12ecd9ea94c500015416732074833873978e78b5fcefcb6296fe4a212 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.364757 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.364762 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.364815 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-55g75" podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: 
connect: connection refused" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.364816 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-55g75" podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.394655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.394692 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.395440 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.400089 4743 patch_prober.go:28] interesting pod/console-f9d7485db-4sghb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.400171 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4sghb" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.403717 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t6sfd" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.421354 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-mg9h4"] Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.422059 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.422287 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:12.922265425 +0000 UTC m=+152.044104994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.422445 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.423251 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:12.923241563 +0000 UTC m=+152.045081112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: W1125 16:01:12.456731 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23344e84_51b1_4a70_8e06_77aa57659f16.slice/crio-a2120ac9c7dd45ea9c10a4a05c61003fe07684d730b60ba82e761255a77d9ba1 WatchSource:0}: Error finding container a2120ac9c7dd45ea9c10a4a05c61003fe07684d730b60ba82e761255a77d9ba1: Status 404 returned error can't find the container with id a2120ac9c7dd45ea9c10a4a05c61003fe07684d730b60ba82e761255a77d9ba1 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.522934 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zjh85" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.524154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.524417 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 16:01:13.024396956 +0000 UTC m=+152.146236505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.524624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.526957 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.026946068 +0000 UTC m=+152.148785617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.625328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.637057 4743 generic.go:334] "Generic (PLEG): container finished" podID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerID="ccf40f4501e44f0d01fd04eab991120fa16e391f4d62ac754fb628ac20f95efa" exitCode=0 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.637156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerDied","Data":"ccf40f4501e44f0d01fd04eab991120fa16e391f4d62ac754fb628ac20f95efa"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.637192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerStarted","Data":"f7d3bbe473fe402d0e96900137b7bdc2c124836c70917213459278ea82906c07"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.639324 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.646912 4743 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.146773417 +0000 UTC m=+152.268612966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.648500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" event={"ID":"63c6e7d6-50c6-4a7e-b318-edbfa0496006","Type":"ContainerStarted","Data":"c04c0339dba379ee3c41e588887d63bbff69203fa5034c712fd864dd747c2825"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.674871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerStarted","Data":"a2120ac9c7dd45ea9c10a4a05c61003fe07684d730b60ba82e761255a77d9ba1"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.689395 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f5d99" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.690584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerStarted","Data":"8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464"} Nov 25 16:01:12 crc 
kubenswrapper[4743]: I1125 16:01:12.690634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerStarted","Data":"639614d12ecd9ea94c500015416732074833873978e78b5fcefcb6296fe4a212"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.703871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17b96ea6-b553-46a9-8433-c8259de8b661","Type":"ContainerStarted","Data":"09ca928d74d7cd48170cc36e1358a49a42a7239af128af71713ebf42f2cce209"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.704064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17b96ea6-b553-46a9-8433-c8259de8b661","Type":"ContainerStarted","Data":"eab21ce38f0b9ce8371e65740b35fbe3fd25194968b5008c327b5a8c65a537fb"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.718137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" event={"ID":"01b5ca28-1828-40b6-97cf-093f8027dab3","Type":"ContainerDied","Data":"c85e15f2c5682c7b56b293b8eb4f040b8cdf644d412a7ea7d49bf0fc8180a68b"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.718191 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85e15f2c5682c7b56b293b8eb4f040b8cdf644d412a7ea7d49bf0fc8180a68b" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.721791 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.725968 4743 generic.go:334] "Generic (PLEG): container finished" podID="052deadf-ed22-4688-a6fe-0b1039308499" containerID="ad900e0a9517100fbe2dc14af056cef1570ee013a4471e542d0e3e538ddfa30d" exitCode=0 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.726293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerDied","Data":"ad900e0a9517100fbe2dc14af056cef1570ee013a4471e542d0e3e538ddfa30d"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.726364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerStarted","Data":"bcf8998037a328539240ed69571ff4f9022a7f6d1e6e7c6dbdb026847e3a4422"} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.726415 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.726961 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.730890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.731172 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.231156877 +0000 UTC m=+152.352996426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.739117 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.7390965999999999 podStartE2EDuration="1.7390966s" podCreationTimestamp="2025-11-25 16:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:12.734486771 +0000 UTC m=+151.856326320" watchObservedRunningTime="2025-11-25 16:01:12.7390966 +0000 UTC m=+151.860936149" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.789156 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a8b728_0425_424c_af4f_5c9a3c20ff4c.slice/crio-8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052deadf_ed22_4688_a6fe_0b1039308499.slice/crio-conmon-ad900e0a9517100fbe2dc14af056cef1570ee013a4471e542d0e3e538ddfa30d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a8b728_0425_424c_af4f_5c9a3c20ff4c.slice/crio-conmon-8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464.scope\": RecentStats: unable to find data in memory cache]" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.832202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.832420 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.332400922 +0000 UTC m=+152.454240471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.833942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.834583 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.334571693 +0000 UTC m=+152.456411342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.835319 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.839531 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:12 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:12 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:12 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.839609 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.881107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4v47p" Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.936381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.936639 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.436607681 +0000 UTC m=+152.558447230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.937072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:12 crc kubenswrapper[4743]: E1125 16:01:12.937875 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 16:01:13.437858666 +0000 UTC m=+152.559698215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-td7r9" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.958506 4743 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-25T16:01:12.137089853Z","Handler":null,"Name":""} Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.964086 4743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.964120 4743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 25 16:01:12 crc kubenswrapper[4743]: I1125 16:01:12.997230 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cbr2m" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.039076 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.047770 4743 patch_prober.go:28] interesting pod/apiserver-76f77b778f-tjsl5 
container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]log ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]etcd ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/generic-apiserver-start-informers ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/max-in-flight-filter ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 25 16:01:13 crc kubenswrapper[4743]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 25 16:01:13 crc kubenswrapper[4743]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectcache ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 25 16:01:13 crc kubenswrapper[4743]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 25 16:01:13 crc kubenswrapper[4743]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 25 16:01:13 crc kubenswrapper[4743]: livez check failed Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.047840 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" podUID="50460f60-2828-42ab-94aa-3ae9d13a5a1e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.056686 4743 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.096486 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:01:13 crc kubenswrapper[4743]: E1125 16:01:13.096715 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b5ca28-1828-40b6-97cf-093f8027dab3" containerName="collect-profiles" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.096730 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5ca28-1828-40b6-97cf-093f8027dab3" containerName="collect-profiles" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.096858 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b5ca28-1828-40b6-97cf-093f8027dab3" containerName="collect-profiles" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.097628 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.099157 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.106748 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.140349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.195675 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.195723 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.236644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-td7r9\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.242180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.242472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.242644 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrsz2\" (UniqueName: \"kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.344150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.344220 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrsz2\" (UniqueName: \"kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.344255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.344679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.344695 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.365282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrsz2\" (UniqueName: \"kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2\") pod \"redhat-marketplace-vzbnf\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.413440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.499867 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.509676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.514203 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.519574 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.647906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.648103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.648144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cnvt\" (UniqueName: \"kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.675185 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:01:13 crc kubenswrapper[4743]: W1125 16:01:13.685663 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8897808a_0b5d_4d3b_b512_652b62458b9e.slice/crio-ee78891ed724359cbc9dc8342461fc90d8458afcf94846627e46087a1f42d2d5 WatchSource:0}: 
Error finding container ee78891ed724359cbc9dc8342461fc90d8458afcf94846627e46087a1f42d2d5: Status 404 returned error can't find the container with id ee78891ed724359cbc9dc8342461fc90d8458afcf94846627e46087a1f42d2d5 Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.737905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" event={"ID":"63c6e7d6-50c6-4a7e-b318-edbfa0496006","Type":"ContainerStarted","Data":"cb2947ca32798ab30b20586d2fba1d08fd72ee10f72055f1a5e40306deff039e"} Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.742154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerDied","Data":"45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3"} Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.741756 4743 generic.go:334] "Generic (PLEG): container finished" podID="23344e84-51b1-4a70-8e06-77aa57659f16" containerID="45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3" exitCode=0 Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.746332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerStarted","Data":"ee78891ed724359cbc9dc8342461fc90d8458afcf94846627e46087a1f42d2d5"} Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.752337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.752494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.752534 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cnvt\" (UniqueName: \"kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.752853 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.752862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.762854 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-td7r9"] Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.764084 4743 generic.go:334] "Generic (PLEG): container finished" podID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerID="8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464" exitCode=0 Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.764145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerDied","Data":"8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464"} Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.765430 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bl29m" podStartSLOduration=13.765413325 podStartE2EDuration="13.765413325s" podCreationTimestamp="2025-11-25 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:13.759196959 +0000 UTC m=+152.881036518" watchObservedRunningTime="2025-11-25 16:01:13.765413325 +0000 UTC m=+152.887252874" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.769522 4743 generic.go:334] "Generic (PLEG): container finished" podID="17b96ea6-b553-46a9-8433-c8259de8b661" containerID="09ca928d74d7cd48170cc36e1358a49a42a7239af128af71713ebf42f2cce209" exitCode=0 Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.769898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17b96ea6-b553-46a9-8433-c8259de8b661","Type":"ContainerDied","Data":"09ca928d74d7cd48170cc36e1358a49a42a7239af128af71713ebf42f2cce209"} Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.782309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cnvt\" (UniqueName: \"kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt\") pod \"redhat-marketplace-kw4x5\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.793586 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 25 16:01:13 crc kubenswrapper[4743]: W1125 16:01:13.799427 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458c2ebd_ea67_4efc_b058_142de4fce612.slice/crio-54e17b5f5395fc265fb5c93d9dbc767801234a3f5e596fbdff72a0f75963d82c WatchSource:0}: Error finding container 54e17b5f5395fc265fb5c93d9dbc767801234a3f5e596fbdff72a0f75963d82c: Status 404 returned error can't find the container with id 54e17b5f5395fc265fb5c93d9dbc767801234a3f5e596fbdff72a0f75963d82c Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.847337 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:13 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:13 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:13 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.847398 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:13 crc kubenswrapper[4743]: I1125 16:01:13.862032 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.299166 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.300425 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.303869 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.324024 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.382026 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.383370 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.386556 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.387438 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.395062 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.441814 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.468368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247rj\" (UniqueName: \"kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.468445 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.468472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.570131 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.570210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247rj\" (UniqueName: \"kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.570255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.570275 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.570296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.571284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.571492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.589527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247rj\" (UniqueName: \"kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj\") pod \"redhat-operators-7rfdr\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.638834 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.671051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.671122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.671213 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.689777 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.699836 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.707292 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.708553 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.716783 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.787687 4743 generic.go:334] "Generic (PLEG): container finished" podID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerID="05f36697cc41613a1b38207d56930956b12608b8205f3cfadcc7b76a1e90e678" exitCode=0 Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.787762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerDied","Data":"05f36697cc41613a1b38207d56930956b12608b8205f3cfadcc7b76a1e90e678"} Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.787792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerStarted","Data":"e2c642ddc57bebb1c06764b62b452aac9b6c6d37fdb173a772a03d325d4dd031"} Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.791053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" event={"ID":"458c2ebd-ea67-4efc-b058-142de4fce612","Type":"ContainerStarted","Data":"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498"} Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.791124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" event={"ID":"458c2ebd-ea67-4efc-b058-142de4fce612","Type":"ContainerStarted","Data":"54e17b5f5395fc265fb5c93d9dbc767801234a3f5e596fbdff72a0f75963d82c"} Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.791796 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.794494 4743 generic.go:334] "Generic (PLEG): container finished" podID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerID="f9a9798d425fe5e74513e4ea933986f7e8952cdf06c6f7abe5dbf2a0de8fb35d" exitCode=0 Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.795321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerDied","Data":"f9a9798d425fe5e74513e4ea933986f7e8952cdf06c6f7abe5dbf2a0de8fb35d"} Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.833205 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" podStartSLOduration=130.83318280700001 podStartE2EDuration="2m10.833182807s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:14.831814969 +0000 UTC m=+153.953654528" watchObservedRunningTime="2025-11-25 16:01:14.833182807 +0000 UTC m=+153.955022356" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.841616 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:14 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:14 crc 
kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:14 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.841683 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.873357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.873458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.873499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks7mb\" (UniqueName: \"kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.976464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " 
pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.976684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks7mb\" (UniqueName: \"kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.976751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.977692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:14 crc kubenswrapper[4743]: I1125 16:01:14.980179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.009621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks7mb\" (UniqueName: \"kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb\") pod \"redhat-operators-djxzs\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:15 
crc kubenswrapper[4743]: I1125 16:01:15.044837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.080633 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:01:15 crc kubenswrapper[4743]: W1125 16:01:15.113353 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a83874_5ff8_4255_b12b_d25ee3e9f5f4.slice/crio-e9333a331cd760bdc6fe740c4cfe33112c7d0eaafb5f1db5783b00422731195e WatchSource:0}: Error finding container e9333a331cd760bdc6fe740c4cfe33112c7d0eaafb5f1db5783b00422731195e: Status 404 returned error can't find the container with id e9333a331cd760bdc6fe740c4cfe33112c7d0eaafb5f1db5783b00422731195e Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.154248 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.164472 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.284715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir\") pod \"17b96ea6-b553-46a9-8433-c8259de8b661\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.284832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access\") pod \"17b96ea6-b553-46a9-8433-c8259de8b661\" (UID: \"17b96ea6-b553-46a9-8433-c8259de8b661\") " Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.285755 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17b96ea6-b553-46a9-8433-c8259de8b661" (UID: "17b96ea6-b553-46a9-8433-c8259de8b661"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.299179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17b96ea6-b553-46a9-8433-c8259de8b661" (UID: "17b96ea6-b553-46a9-8433-c8259de8b661"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.341740 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5xxzs" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.397488 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b96ea6-b553-46a9-8433-c8259de8b661-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.397523 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b96ea6-b553-46a9-8433-c8259de8b661-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.511183 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:01:15 crc kubenswrapper[4743]: W1125 16:01:15.525413 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518378d4_cd4d_40dd_bf66_3ba3e0dc9e24.slice/crio-770b58829b67dbf4ba0f78b65dd940604a8e0b446b80ac44661a714959318a3c WatchSource:0}: Error finding container 770b58829b67dbf4ba0f78b65dd940604a8e0b446b80ac44661a714959318a3c: Status 404 returned error can't find the container with id 770b58829b67dbf4ba0f78b65dd940604a8e0b446b80ac44661a714959318a3c Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.806356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17c4384f-181e-4bfc-9729-ca66f0d6bfdb","Type":"ContainerStarted","Data":"e8e1ae10ebc28ef1fcdb26e7d28ad8d235f5fd0bc29f44ef54b60eaa61dc1f88"} Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.808417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"17b96ea6-b553-46a9-8433-c8259de8b661","Type":"ContainerDied","Data":"eab21ce38f0b9ce8371e65740b35fbe3fd25194968b5008c327b5a8c65a537fb"} Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.808461 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.808470 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab21ce38f0b9ce8371e65740b35fbe3fd25194968b5008c327b5a8c65a537fb" Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.810003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerStarted","Data":"770b58829b67dbf4ba0f78b65dd940604a8e0b446b80ac44661a714959318a3c"} Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.812775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerStarted","Data":"e9333a331cd760bdc6fe740c4cfe33112c7d0eaafb5f1db5783b00422731195e"} Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.839082 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:15 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:15 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:15 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:15 crc kubenswrapper[4743]: I1125 16:01:15.839159 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.834038 4743 generic.go:334] "Generic (PLEG): container finished" podID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerID="e4db81339c378fa88b4d50a026cd27e088d6cfd4c614ec1f886ba1137b91892f" exitCode=0 Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.834127 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerDied","Data":"e4db81339c378fa88b4d50a026cd27e088d6cfd4c614ec1f886ba1137b91892f"} Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.843619 4743 generic.go:334] "Generic (PLEG): container finished" podID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerID="2a86f81a16b515a48a788dbb7670e9a6a4216925067980f0da265a2898bd6548" exitCode=0 Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.843665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerDied","Data":"2a86f81a16b515a48a788dbb7670e9a6a4216925067980f0da265a2898bd6548"} Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.845802 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:16 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:16 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:16 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.845836 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.849907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17c4384f-181e-4bfc-9729-ca66f0d6bfdb","Type":"ContainerStarted","Data":"d82f969e790f7db0fc223488b4746ad202ca0172634cdc200aa0951bd9e9144c"} Nov 25 16:01:16 crc kubenswrapper[4743]: I1125 16:01:16.899516 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.89949884 podStartE2EDuration="2.89949884s" podCreationTimestamp="2025-11-25 16:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:16.894655644 +0000 UTC m=+156.016495213" watchObservedRunningTime="2025-11-25 16:01:16.89949884 +0000 UTC m=+156.021338389" Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.732135 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.742133 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tjsl5" Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.853469 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:17 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:17 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:17 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.853541 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" 
podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.892565 4743 generic.go:334] "Generic (PLEG): container finished" podID="17c4384f-181e-4bfc-9729-ca66f0d6bfdb" containerID="d82f969e790f7db0fc223488b4746ad202ca0172634cdc200aa0951bd9e9144c" exitCode=0 Nov 25 16:01:17 crc kubenswrapper[4743]: I1125 16:01:17.893693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17c4384f-181e-4bfc-9729-ca66f0d6bfdb","Type":"ContainerDied","Data":"d82f969e790f7db0fc223488b4746ad202ca0172634cdc200aa0951bd9e9144c"} Nov 25 16:01:18 crc kubenswrapper[4743]: I1125 16:01:18.838984 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:18 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:18 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:18 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:18 crc kubenswrapper[4743]: I1125 16:01:18.839059 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:19 crc kubenswrapper[4743]: I1125 16:01:19.838510 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:19 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:19 crc kubenswrapper[4743]: [+]process-running ok 
Nov 25 16:01:19 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:19 crc kubenswrapper[4743]: I1125 16:01:19.838576 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:20 crc kubenswrapper[4743]: I1125 16:01:20.077438 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:01:20 crc kubenswrapper[4743]: I1125 16:01:20.077514 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:01:20 crc kubenswrapper[4743]: I1125 16:01:20.838368 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:20 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:20 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:20 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:20 crc kubenswrapper[4743]: I1125 16:01:20.838447 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:21 crc kubenswrapper[4743]: I1125 
16:01:21.837606 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:21 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:21 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:21 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:21 crc kubenswrapper[4743]: I1125 16:01:21.837922 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.362529 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.362607 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-55g75" podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.362617 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-55g75 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.362686 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-55g75" 
podUID="2b4b5943-89e2-483d-a034-1344fec03f98" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.394645 4743 patch_prober.go:28] interesting pod/console-f9d7485db-4sghb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.394695 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4sghb" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.837559 4743 patch_prober.go:28] interesting pod/router-default-5444994796-ldzq4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 16:01:22 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 25 16:01:22 crc kubenswrapper[4743]: [+]process-running ok Nov 25 16:01:22 crc kubenswrapper[4743]: healthz check failed Nov 25 16:01:22 crc kubenswrapper[4743]: I1125 16:01:22.837647 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ldzq4" podUID="505a2c86-f87d-4179-9156-7c6b98ba9b84" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 16:01:23 crc kubenswrapper[4743]: I1125 16:01:23.838814 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:23 crc kubenswrapper[4743]: I1125 16:01:23.843018 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ldzq4" Nov 25 16:01:26 crc kubenswrapper[4743]: I1125 16:01:26.877120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:01:26 crc kubenswrapper[4743]: I1125 16:01:26.883305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/617512f9-f767-4615-a9d2-132c6c73a69d-metrics-certs\") pod \"network-metrics-daemon-s9t79\" (UID: \"617512f9-f767-4615-a9d2-132c6c73a69d\") " pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:01:26 crc kubenswrapper[4743]: I1125 16:01:26.992895 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s9t79" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.666814 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.686969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access\") pod \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.692751 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17c4384f-181e-4bfc-9729-ca66f0d6bfdb" (UID: "17c4384f-181e-4bfc-9729-ca66f0d6bfdb"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.788061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir\") pod \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\" (UID: \"17c4384f-181e-4bfc-9729-ca66f0d6bfdb\") " Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.788360 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.788358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17c4384f-181e-4bfc-9729-ca66f0d6bfdb" (UID: "17c4384f-181e-4bfc-9729-ca66f0d6bfdb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.890303 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17c4384f-181e-4bfc-9729-ca66f0d6bfdb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.955517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17c4384f-181e-4bfc-9729-ca66f0d6bfdb","Type":"ContainerDied","Data":"e8e1ae10ebc28ef1fcdb26e7d28ad8d235f5fd0bc29f44ef54b60eaa61dc1f88"} Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.955555 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e1ae10ebc28ef1fcdb26e7d28ad8d235f5fd0bc29f44ef54b60eaa61dc1f88" Nov 25 16:01:27 crc kubenswrapper[4743]: I1125 16:01:27.955569 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 16:01:32 crc kubenswrapper[4743]: I1125 16:01:32.368293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-55g75" Nov 25 16:01:32 crc kubenswrapper[4743]: I1125 16:01:32.401361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:32 crc kubenswrapper[4743]: I1125 16:01:32.406013 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:01:33 crc kubenswrapper[4743]: I1125 16:01:33.519515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:01:40 crc kubenswrapper[4743]: E1125 16:01:40.385819 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 25 16:01:40 crc kubenswrapper[4743]: E1125 16:01:40.387063 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6knp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4hd5b_openshift-marketplace(96a8b728-0425-424c-af4f-5c9a3c20ff4c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Nov 25 16:01:40 crc kubenswrapper[4743]: E1125 16:01:40.388313 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4hd5b" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" Nov 25 16:01:42 crc kubenswrapper[4743]: I1125 16:01:42.856122 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r5h55" Nov 25 16:01:44 crc kubenswrapper[4743]: E1125 16:01:44.091082 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 25 16:01:44 crc kubenswrapper[4743]: E1125 16:01:44.091247 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2cnvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kw4x5_openshift-marketplace(ac1b537d-a3a6-4b63-9fa4-d0b15088c3df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 16:01:44 crc kubenswrapper[4743]: E1125 16:01:44.092516 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kw4x5" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" Nov 25 16:01:49 crc 
kubenswrapper[4743]: I1125 16:01:49.276945 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 16:01:50 crc kubenswrapper[4743]: I1125 16:01:50.078428 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:01:50 crc kubenswrapper[4743]: I1125 16:01:50.078543 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:01:50 crc kubenswrapper[4743]: E1125 16:01:50.910795 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kw4x5" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.711124 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 16:01:53 crc kubenswrapper[4743]: E1125 16:01:53.711799 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c4384f-181e-4bfc-9729-ca66f0d6bfdb" containerName="pruner" Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.711815 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c4384f-181e-4bfc-9729-ca66f0d6bfdb" containerName="pruner" Nov 25 16:01:53 crc kubenswrapper[4743]: E1125 16:01:53.711831 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17b96ea6-b553-46a9-8433-c8259de8b661" containerName="pruner"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.711839 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b96ea6-b553-46a9-8433-c8259de8b661" containerName="pruner"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.711968 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b96ea6-b553-46a9-8433-c8259de8b661" containerName="pruner"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.711989 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c4384f-181e-4bfc-9729-ca66f0d6bfdb" containerName="pruner"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.712458 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.715844 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.716438 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.727108 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 25 16:01:53 crc kubenswrapper[4743]: E1125 16:01:53.792332 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 25 16:01:53 crc kubenswrapper[4743]: E1125 16:01:53.792552 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrsz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vzbnf_openshift-marketplace(8897808a-0b5d-4d3b-b512-652b62458b9e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 16:01:53 crc kubenswrapper[4743]: E1125 16:01:53.793976 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vzbnf" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.845002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.845499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.947092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.947163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.947247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:53 crc kubenswrapper[4743]: I1125 16:01:53.971520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:54 crc kubenswrapper[4743]: I1125 16:01:54.041974 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.712143 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vzbnf" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.739802 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.739949 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ks7mb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-djxzs_openshift-marketplace(518378d4-cd4d-40dd-bf66-3ba3e0dc9e24): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.741048 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-djxzs" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.768172 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.768311 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-247rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7rfdr_openshift-marketplace(11a83874-5ff8-4255-b12b-d25ee3e9f5f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 16:01:56 crc kubenswrapper[4743]: E1125 16:01:56.770286 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7rfdr" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4"
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.129303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerStarted","Data":"226626d725cf1820d7fc9d332b462337d513b9d8fb083001ecddf14c01de9457"}
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.133045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerStarted","Data":"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4"}
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.137137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerStarted","Data":"aa402a519386ac3c3f53bbd8cf17148b3494ade4a657a17789e9691ee2835453"}
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.139751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerStarted","Data":"a3918d7b98c7f735047f2d566fbc08040cf9d9f506834a9612aaaa0cbe365e2b"}
Nov 25 16:01:57 crc kubenswrapper[4743]: E1125 16:01:57.141786 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7rfdr" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4"
Nov 25 16:01:57 crc kubenswrapper[4743]: E1125 16:01:57.141869 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-djxzs" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24"
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.157139 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s9t79"]
Nov 25 16:01:57 crc kubenswrapper[4743]: I1125 16:01:57.215176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 25 16:01:57 crc kubenswrapper[4743]: W1125 16:01:57.246403 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9bfce331_5ee3_49fd_a4cd_53e4b1b93367.slice/crio-76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988 WatchSource:0}: Error finding container 76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988: Status 404 returned error can't find the container with id 76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.147095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9t79" event={"ID":"617512f9-f767-4615-a9d2-132c6c73a69d","Type":"ContainerStarted","Data":"602ec7d2172fc8f76394112763e63e4fa13d738995fdeac68777a766241fd473"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.147485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9t79" event={"ID":"617512f9-f767-4615-a9d2-132c6c73a69d","Type":"ContainerStarted","Data":"2a6cb761ab22c52eae41dfb2e333bbe09d67a41c3699e3da7d434a16f85433fb"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.147500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s9t79" event={"ID":"617512f9-f767-4615-a9d2-132c6c73a69d","Type":"ContainerStarted","Data":"40bd001fca4fa1e7847f85b02152967d209dd41ae45810a06a69384f07804aec"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.150719 4743 generic.go:334] "Generic (PLEG): container finished" podID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerID="aa402a519386ac3c3f53bbd8cf17148b3494ade4a657a17789e9691ee2835453" exitCode=0
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.150798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerDied","Data":"aa402a519386ac3c3f53bbd8cf17148b3494ade4a657a17789e9691ee2835453"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.152446 4743 generic.go:334] "Generic (PLEG): container finished" podID="9bfce331-5ee3-49fd-a4cd-53e4b1b93367" containerID="a072b18f966b7371092b97aa06314441f3e6c565f1881b7cba25f6f873632051" exitCode=0
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.152478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9bfce331-5ee3-49fd-a4cd-53e4b1b93367","Type":"ContainerDied","Data":"a072b18f966b7371092b97aa06314441f3e6c565f1881b7cba25f6f873632051"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.152502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9bfce331-5ee3-49fd-a4cd-53e4b1b93367","Type":"ContainerStarted","Data":"76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.154760 4743 generic.go:334] "Generic (PLEG): container finished" podID="052deadf-ed22-4688-a6fe-0b1039308499" containerID="a3918d7b98c7f735047f2d566fbc08040cf9d9f506834a9612aaaa0cbe365e2b" exitCode=0
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.154818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerDied","Data":"a3918d7b98c7f735047f2d566fbc08040cf9d9f506834a9612aaaa0cbe365e2b"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.157065 4743 generic.go:334] "Generic (PLEG): container finished" podID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerID="226626d725cf1820d7fc9d332b462337d513b9d8fb083001ecddf14c01de9457" exitCode=0
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.157129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerDied","Data":"226626d725cf1820d7fc9d332b462337d513b9d8fb083001ecddf14c01de9457"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.160395 4743 generic.go:334] "Generic (PLEG): container finished" podID="23344e84-51b1-4a70-8e06-77aa57659f16" containerID="ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4" exitCode=0
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.160431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerDied","Data":"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4"}
Nov 25 16:01:58 crc kubenswrapper[4743]: I1125 16:01:58.166112 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-s9t79" podStartSLOduration=174.166094182 podStartE2EDuration="2m54.166094182s" podCreationTimestamp="2025-11-25 15:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:58.163639959 +0000 UTC m=+197.285479528" watchObservedRunningTime="2025-11-25 16:01:58.166094182 +0000 UTC m=+197.287933721"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.170851 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerStarted","Data":"1eb07f805fe0baf72e7c20b7d6f1b12fc8ac0e69307c5507d975de7dd4e19286"}
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.174109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerStarted","Data":"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5"}
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.176658 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerStarted","Data":"c08f2d101df14fb2ead5cfa2de45483a81ababc13d5834317eb865f2e997fcac"}
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.178684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerStarted","Data":"e0fc2e99cfce81547f7aa4024eeac93e98af8679ecad4fd1fb8a920737352989"}
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.215879 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qsq8z" podStartSLOduration=2.16689793 podStartE2EDuration="48.215859541s" podCreationTimestamp="2025-11-25 16:01:11 +0000 UTC" firstStartedPulling="2025-11-25 16:01:12.638983957 +0000 UTC m=+151.760823506" lastFinishedPulling="2025-11-25 16:01:58.687945568 +0000 UTC m=+197.809785117" observedRunningTime="2025-11-25 16:01:59.215463681 +0000 UTC m=+198.337303250" watchObservedRunningTime="2025-11-25 16:01:59.215859541 +0000 UTC m=+198.337699090"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.270543 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4hd5b" podStartSLOduration=2.133137261 podStartE2EDuration="48.270518944s" podCreationTimestamp="2025-11-25 16:01:11 +0000 UTC" firstStartedPulling="2025-11-25 16:01:12.692189828 +0000 UTC m=+151.814029367" lastFinishedPulling="2025-11-25 16:01:58.829571501 +0000 UTC m=+197.951411050" observedRunningTime="2025-11-25 16:01:59.258245487 +0000 UTC m=+198.380085056" watchObservedRunningTime="2025-11-25 16:01:59.270518944 +0000 UTC m=+198.392358493"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.322523 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.323440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.324555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mg9h4" podStartSLOduration=3.473279995 podStartE2EDuration="48.324538881s" podCreationTimestamp="2025-11-25 16:01:11 +0000 UTC" firstStartedPulling="2025-11-25 16:01:13.745669698 +0000 UTC m=+152.867509247" lastFinishedPulling="2025-11-25 16:01:58.596928584 +0000 UTC m=+197.718768133" observedRunningTime="2025-11-25 16:01:59.321036051 +0000 UTC m=+198.442875620" watchObservedRunningTime="2025-11-25 16:01:59.324538881 +0000 UTC m=+198.446378430"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.354400 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.384039 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46wjh" podStartSLOduration=2.155161168 podStartE2EDuration="48.38402042s" podCreationTimestamp="2025-11-25 16:01:11 +0000 UTC" firstStartedPulling="2025-11-25 16:01:12.729260363 +0000 UTC m=+151.851099912" lastFinishedPulling="2025-11-25 16:01:58.958119615 +0000 UTC m=+198.079959164" observedRunningTime="2025-11-25 16:01:59.361951479 +0000 UTC m=+198.483791028" watchObservedRunningTime="2025-11-25 16:01:59.38402042 +0000 UTC m=+198.505859969"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.447789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.447864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.447910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.495863 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.549435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir\") pod \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") "
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.549564 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9bfce331-5ee3-49fd-a4cd-53e4b1b93367" (UID: "9bfce331-5ee3-49fd-a4cd-53e4b1b93367"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.549669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access\") pod \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\" (UID: \"9bfce331-5ee3-49fd-a4cd-53e4b1b93367\") "
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.549961 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.550018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.550102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.550155 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.550212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.550753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.560781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9bfce331-5ee3-49fd-a4cd-53e4b1b93367" (UID: "9bfce331-5ee3-49fd-a4cd-53e4b1b93367"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.575715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access\") pod \"installer-9-crc\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.651221 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9bfce331-5ee3-49fd-a4cd-53e4b1b93367-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 25 16:01:59 crc kubenswrapper[4743]: I1125 16:01:59.663312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 16:02:00 crc kubenswrapper[4743]: I1125 16:02:00.063389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 16:02:00 crc kubenswrapper[4743]: W1125 16:02:00.069731 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod40d489a4_a435_45ed_b549_aec8103bb098.slice/crio-fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe WatchSource:0}: Error finding container fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe: Status 404 returned error can't find the container with id fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe
Nov 25 16:02:00 crc kubenswrapper[4743]: I1125 16:02:00.183879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d489a4-a435-45ed-b549-aec8103bb098","Type":"ContainerStarted","Data":"fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe"}
Nov 25 16:02:00 crc kubenswrapper[4743]: I1125 16:02:00.186071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9bfce331-5ee3-49fd-a4cd-53e4b1b93367","Type":"ContainerDied","Data":"76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988"}
Nov 25 16:02:00 crc kubenswrapper[4743]: I1125 16:02:00.186112 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76913c0ecb1df776e862391eccd948585def6d34ed6b3986559712adcb2c2988"
Nov 25 16:02:00 crc kubenswrapper[4743]: I1125 16:02:00.186141 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.192303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d489a4-a435-45ed-b549-aec8103bb098","Type":"ContainerStarted","Data":"8ee6a6213a4ba9f12ba0798d5596754e90db36e6b601375337fef9f01976c6e4"}
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.208685 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.208663578 podStartE2EDuration="2.208663578s" podCreationTimestamp="2025-11-25 16:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:01.206984325 +0000 UTC m=+200.328823904" watchObservedRunningTime="2025-11-25 16:02:01.208663578 +0000 UTC m=+200.330503137"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.502535 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qsq8z"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.502972 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qsq8z"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.640020 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qsq8z"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.647703 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46wjh"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.647752 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-46wjh"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.693295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46wjh"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.822048 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4hd5b"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.822108 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4hd5b"
Nov 25 16:02:01 crc kubenswrapper[4743]: I1125 16:02:01.860778 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4hd5b"
Nov 25 16:02:02 crc kubenswrapper[4743]: I1125 16:02:02.080903 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:02 crc kubenswrapper[4743]: I1125 16:02:02.081317 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:02 crc kubenswrapper[4743]: I1125 16:02:02.119401 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:03 crc kubenswrapper[4743]: I1125 16:02:03.243639 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.372098 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mg9h4"]
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.373019 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mg9h4" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="registry-server" containerID="cri-o://2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5" gracePeriod=2
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.716405 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.841468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nf5d\" (UniqueName: \"kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d\") pod \"23344e84-51b1-4a70-8e06-77aa57659f16\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") "
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.842033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content\") pod \"23344e84-51b1-4a70-8e06-77aa57659f16\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") "
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.842084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities\") pod \"23344e84-51b1-4a70-8e06-77aa57659f16\" (UID: \"23344e84-51b1-4a70-8e06-77aa57659f16\") "
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.842975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities" (OuterVolumeSpecName: "utilities") pod "23344e84-51b1-4a70-8e06-77aa57659f16" (UID: "23344e84-51b1-4a70-8e06-77aa57659f16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.860727 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d" (OuterVolumeSpecName: "kube-api-access-9nf5d") pod "23344e84-51b1-4a70-8e06-77aa57659f16" (UID: "23344e84-51b1-4a70-8e06-77aa57659f16"). InnerVolumeSpecName "kube-api-access-9nf5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.903459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23344e84-51b1-4a70-8e06-77aa57659f16" (UID: "23344e84-51b1-4a70-8e06-77aa57659f16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.943727 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nf5d\" (UniqueName: \"kubernetes.io/projected/23344e84-51b1-4a70-8e06-77aa57659f16-kube-api-access-9nf5d\") on node \"crc\" DevicePath \"\""
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.943783 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:02:05 crc kubenswrapper[4743]: I1125 16:02:05.943797 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23344e84-51b1-4a70-8e06-77aa57659f16-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.239297 4743 generic.go:334] "Generic (PLEG): container finished" podID="23344e84-51b1-4a70-8e06-77aa57659f16" containerID="2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5" exitCode=0
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.239360 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mg9h4"
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.239375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerDied","Data":"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5"}
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.239416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg9h4" event={"ID":"23344e84-51b1-4a70-8e06-77aa57659f16","Type":"ContainerDied","Data":"a2120ac9c7dd45ea9c10a4a05c61003fe07684d730b60ba82e761255a77d9ba1"}
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.239437 4743 scope.go:117] "RemoveContainer" containerID="2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5"
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.256978 4743 scope.go:117] "RemoveContainer" containerID="ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4"
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.274361 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mg9h4"]
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.278396 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mg9h4"]
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.299892 4743 scope.go:117] "RemoveContainer" containerID="45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3"
Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.314200 4743 scope.go:117] "RemoveContainer" containerID="2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5"
Nov 25
16:02:06 crc kubenswrapper[4743]: E1125 16:02:06.314921 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5\": container with ID starting with 2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5 not found: ID does not exist" containerID="2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5" Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.314970 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5"} err="failed to get container status \"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5\": rpc error: code = NotFound desc = could not find container \"2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5\": container with ID starting with 2540922514c83c39521e3ce1f8ce9712ed061b419ae3e06e435e2cd0aa8916e5 not found: ID does not exist" Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.315026 4743 scope.go:117] "RemoveContainer" containerID="ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4" Nov 25 16:02:06 crc kubenswrapper[4743]: E1125 16:02:06.315399 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4\": container with ID starting with ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4 not found: ID does not exist" containerID="ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4" Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.315428 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4"} err="failed to get container status 
\"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4\": rpc error: code = NotFound desc = could not find container \"ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4\": container with ID starting with ff71d6f9dc57936b224de21a233ba2af5d8843d7ef3394131f50dba9965176a4 not found: ID does not exist" Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.315446 4743 scope.go:117] "RemoveContainer" containerID="45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3" Nov 25 16:02:06 crc kubenswrapper[4743]: E1125 16:02:06.315731 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3\": container with ID starting with 45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3 not found: ID does not exist" containerID="45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3" Nov 25 16:02:06 crc kubenswrapper[4743]: I1125 16:02:06.315758 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3"} err="failed to get container status \"45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3\": rpc error: code = NotFound desc = could not find container \"45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3\": container with ID starting with 45c6fe954cb22a5d5734bd6d30d746e912b062f74f28841f6dc31a11714b31b3 not found: ID does not exist" Nov 25 16:02:07 crc kubenswrapper[4743]: I1125 16:02:07.782916 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" path="/var/lib/kubelet/pods/23344e84-51b1-4a70-8e06-77aa57659f16/volumes" Nov 25 16:02:08 crc kubenswrapper[4743]: I1125 16:02:08.261963 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" 
event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerStarted","Data":"3a428dfc71a42b6c7329af0be4c379438a75948e46bacfef12ce4abd6672eb92"} Nov 25 16:02:09 crc kubenswrapper[4743]: I1125 16:02:09.270113 4743 generic.go:334] "Generic (PLEG): container finished" podID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerID="3a428dfc71a42b6c7329af0be4c379438a75948e46bacfef12ce4abd6672eb92" exitCode=0 Nov 25 16:02:09 crc kubenswrapper[4743]: I1125 16:02:09.270179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerDied","Data":"3a428dfc71a42b6c7329af0be4c379438a75948e46bacfef12ce4abd6672eb92"} Nov 25 16:02:10 crc kubenswrapper[4743]: I1125 16:02:10.276218 4743 generic.go:334] "Generic (PLEG): container finished" podID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerID="7db7089c06e955fec1f8c6ee8416a1890c2d7f0602ec7b6b1aafbf1ec4a48333" exitCode=0 Nov 25 16:02:10 crc kubenswrapper[4743]: I1125 16:02:10.276294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerDied","Data":"7db7089c06e955fec1f8c6ee8416a1890c2d7f0602ec7b6b1aafbf1ec4a48333"} Nov 25 16:02:10 crc kubenswrapper[4743]: I1125 16:02:10.278637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerStarted","Data":"d8d9f90380fc2c9a32125c51b79fe87ec9fe39449dc4b1fb61435c8d6dbad1eb"} Nov 25 16:02:10 crc kubenswrapper[4743]: I1125 16:02:10.307734 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kw4x5" podStartSLOduration=2.155258575 podStartE2EDuration="57.307715363s" podCreationTimestamp="2025-11-25 16:01:13 +0000 UTC" firstStartedPulling="2025-11-25 16:01:14.78966748 +0000 UTC 
m=+153.911507029" lastFinishedPulling="2025-11-25 16:02:09.942124268 +0000 UTC m=+209.063963817" observedRunningTime="2025-11-25 16:02:10.306289206 +0000 UTC m=+209.428128765" watchObservedRunningTime="2025-11-25 16:02:10.307715363 +0000 UTC m=+209.429554912" Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.285110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerStarted","Data":"8a3b9ef2f82bad6f70623d3b7b0841cabf4453d54e80bb811da4baeb8b5866cb"} Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.288278 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerStarted","Data":"34645ddc3a2c0f27af4b13786b8fdc20509648d547692b30627846198603d6c1"} Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.290640 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerStarted","Data":"4eafcf54262a2945befe2d7a6c8061ad0dd96e35110307bf54f7ba847a959412"} Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.302649 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vzbnf" podStartSLOduration=2.107814363 podStartE2EDuration="58.302574631s" podCreationTimestamp="2025-11-25 16:01:13 +0000 UTC" firstStartedPulling="2025-11-25 16:01:14.79676413 +0000 UTC m=+153.918603679" lastFinishedPulling="2025-11-25 16:02:10.991524398 +0000 UTC m=+210.113363947" observedRunningTime="2025-11-25 16:02:11.301128855 +0000 UTC m=+210.422968404" watchObservedRunningTime="2025-11-25 16:02:11.302574631 +0000 UTC m=+210.424414180" Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.544823 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.690324 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:02:11 crc kubenswrapper[4743]: I1125 16:02:11.863211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:02:12 crc kubenswrapper[4743]: I1125 16:02:12.300893 4743 generic.go:334] "Generic (PLEG): container finished" podID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerID="34645ddc3a2c0f27af4b13786b8fdc20509648d547692b30627846198603d6c1" exitCode=0 Nov 25 16:02:12 crc kubenswrapper[4743]: I1125 16:02:12.300960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerDied","Data":"34645ddc3a2c0f27af4b13786b8fdc20509648d547692b30627846198603d6c1"} Nov 25 16:02:12 crc kubenswrapper[4743]: I1125 16:02:12.302853 4743 generic.go:334] "Generic (PLEG): container finished" podID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerID="4eafcf54262a2945befe2d7a6c8061ad0dd96e35110307bf54f7ba847a959412" exitCode=0 Nov 25 16:02:12 crc kubenswrapper[4743]: I1125 16:02:12.302881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerDied","Data":"4eafcf54262a2945befe2d7a6c8061ad0dd96e35110307bf54f7ba847a959412"} Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.310413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerStarted","Data":"5a80d39460535245ce9d480867f2ac6c6612138f9a1a90d575865315717e32aa"} Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.324116 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerStarted","Data":"17fb8bffe366bda42499c00898393e97dbcae82282aa7a71c88c6480086d0aca"} Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.335315 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-djxzs" podStartSLOduration=3.458666387 podStartE2EDuration="59.335297181s" podCreationTimestamp="2025-11-25 16:01:14 +0000 UTC" firstStartedPulling="2025-11-25 16:01:16.842135843 +0000 UTC m=+155.963975392" lastFinishedPulling="2025-11-25 16:02:12.718766637 +0000 UTC m=+211.840606186" observedRunningTime="2025-11-25 16:02:13.334983964 +0000 UTC m=+212.456823523" watchObservedRunningTime="2025-11-25 16:02:13.335297181 +0000 UTC m=+212.457136730" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.368525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7rfdr" podStartSLOduration=3.515503146 podStartE2EDuration="59.368507241s" podCreationTimestamp="2025-11-25 16:01:14 +0000 UTC" firstStartedPulling="2025-11-25 16:01:16.845713153 +0000 UTC m=+155.967552702" lastFinishedPulling="2025-11-25 16:02:12.698717238 +0000 UTC m=+211.820556797" observedRunningTime="2025-11-25 16:02:13.36658386 +0000 UTC m=+212.488423419" watchObservedRunningTime="2025-11-25 16:02:13.368507241 +0000 UTC m=+212.490346790" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.414853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.414921 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.470558 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.863045 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.863095 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:13 crc kubenswrapper[4743]: I1125 16:02:13.900916 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.171242 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.171508 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4hd5b" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="registry-server" containerID="cri-o://c08f2d101df14fb2ead5cfa2de45483a81ababc13d5834317eb865f2e997fcac" gracePeriod=2 Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.336389 4743 generic.go:334] "Generic (PLEG): container finished" podID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerID="c08f2d101df14fb2ead5cfa2de45483a81ababc13d5834317eb865f2e997fcac" exitCode=0 Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.336478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerDied","Data":"c08f2d101df14fb2ead5cfa2de45483a81ababc13d5834317eb865f2e997fcac"} Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.521778 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.640137 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.640207 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.655090 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities\") pod \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.655307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content\") pod \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.655340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6knp8\" (UniqueName: \"kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8\") pod \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\" (UID: \"96a8b728-0425-424c-af4f-5c9a3c20ff4c\") " Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.655952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities" (OuterVolumeSpecName: "utilities") pod "96a8b728-0425-424c-af4f-5c9a3c20ff4c" (UID: "96a8b728-0425-424c-af4f-5c9a3c20ff4c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.663230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8" (OuterVolumeSpecName: "kube-api-access-6knp8") pod "96a8b728-0425-424c-af4f-5c9a3c20ff4c" (UID: "96a8b728-0425-424c-af4f-5c9a3c20ff4c"). InnerVolumeSpecName "kube-api-access-6knp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.712193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96a8b728-0425-424c-af4f-5c9a3c20ff4c" (UID: "96a8b728-0425-424c-af4f-5c9a3c20ff4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.756457 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.756496 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a8b728-0425-424c-af4f-5c9a3c20ff4c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:14 crc kubenswrapper[4743]: I1125 16:02:14.756508 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6knp8\" (UniqueName: \"kubernetes.io/projected/96a8b728-0425-424c-af4f-5c9a3c20ff4c-kube-api-access-6knp8\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.081659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:02:15 crc kubenswrapper[4743]: 
I1125 16:02:15.081748 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.343711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hd5b" event={"ID":"96a8b728-0425-424c-af4f-5c9a3c20ff4c","Type":"ContainerDied","Data":"639614d12ecd9ea94c500015416732074833873978e78b5fcefcb6296fe4a212"} Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.343775 4743 scope.go:117] "RemoveContainer" containerID="c08f2d101df14fb2ead5cfa2de45483a81ababc13d5834317eb865f2e997fcac" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.343773 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hd5b" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.361657 4743 scope.go:117] "RemoveContainer" containerID="aa402a519386ac3c3f53bbd8cf17148b3494ade4a657a17789e9691ee2835453" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.375389 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.382808 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4hd5b"] Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.394928 4743 scope.go:117] "RemoveContainer" containerID="8d034c94ab290f68f50a9639aad1967022acbc412fe3648b17f4b1d89b82a464" Nov 25 16:02:15 crc kubenswrapper[4743]: I1125 16:02:15.687197 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7rfdr" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="registry-server" probeResult="failure" output=< Nov 25 16:02:15 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 25 16:02:15 crc kubenswrapper[4743]: > Nov 25 16:02:15 crc 
kubenswrapper[4743]: I1125 16:02:15.798536 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" path="/var/lib/kubelet/pods/96a8b728-0425-424c-af4f-5c9a3c20ff4c/volumes" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.124545 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-djxzs" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="registry-server" probeResult="failure" output=< Nov 25 16:02:16 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 25 16:02:16 crc kubenswrapper[4743]: > Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265051 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98w29"] Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265249 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265261 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265299 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="extract-utilities" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265305 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="extract-utilities" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265318 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfce331-5ee3-49fd-a4cd-53e4b1b93367" containerName="pruner" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265324 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfce331-5ee3-49fd-a4cd-53e4b1b93367" 
containerName="pruner" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265333 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265340 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265351 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="extract-content" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265357 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="extract-content" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265364 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="extract-utilities" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265371 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="extract-utilities" Nov 25 16:02:16 crc kubenswrapper[4743]: E1125 16:02:16.265382 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="extract-content" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265388 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="extract-content" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265485 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a8b728-0425-424c-af4f-5c9a3c20ff4c" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265498 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfce331-5ee3-49fd-a4cd-53e4b1b93367" 
containerName="pruner" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265507 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="23344e84-51b1-4a70-8e06-77aa57659f16" containerName="registry-server" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.265896 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.315059 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98w29"] Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-registry-certificates\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-bound-sa-token\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374490 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-trusted-ca\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/970b5a75-2a9a-4525-9da2-33846fac733c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/970b5a75-2a9a-4525-9da2-33846fac733c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374681 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-registry-tls\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.374737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2d8q\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-kube-api-access-n2d8q\") pod \"image-registry-66df7c8f76-98w29\" 
(UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.412605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/970b5a75-2a9a-4525-9da2-33846fac733c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/970b5a75-2a9a-4525-9da2-33846fac733c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475871 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-registry-tls\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2d8q\" (UniqueName: 
\"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-kube-api-access-n2d8q\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-registry-certificates\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-bound-sa-token\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.475980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-trusted-ca\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.476659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/970b5a75-2a9a-4525-9da2-33846fac733c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.477188 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-trusted-ca\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.478373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/970b5a75-2a9a-4525-9da2-33846fac733c-registry-certificates\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.481522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-registry-tls\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.483206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/970b5a75-2a9a-4525-9da2-33846fac733c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.491560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-bound-sa-token\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.496224 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2d8q\" (UniqueName: \"kubernetes.io/projected/970b5a75-2a9a-4525-9da2-33846fac733c-kube-api-access-n2d8q\") pod \"image-registry-66df7c8f76-98w29\" (UID: \"970b5a75-2a9a-4525-9da2-33846fac733c\") " pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.582097 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:16 crc kubenswrapper[4743]: I1125 16:02:16.994586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-98w29"] Nov 25 16:02:17 crc kubenswrapper[4743]: W1125 16:02:17.000240 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod970b5a75_2a9a_4525_9da2_33846fac733c.slice/crio-6d9bdf822e04bd020aac0a3a94a9bb56b4a7e2fb5d9415bd47ae663b83b5bd87 WatchSource:0}: Error finding container 6d9bdf822e04bd020aac0a3a94a9bb56b4a7e2fb5d9415bd47ae663b83b5bd87: Status 404 returned error can't find the container with id 6d9bdf822e04bd020aac0a3a94a9bb56b4a7e2fb5d9415bd47ae663b83b5bd87 Nov 25 16:02:17 crc kubenswrapper[4743]: I1125 16:02:17.354958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" event={"ID":"970b5a75-2a9a-4525-9da2-33846fac733c","Type":"ContainerStarted","Data":"6d9bdf822e04bd020aac0a3a94a9bb56b4a7e2fb5d9415bd47ae663b83b5bd87"} Nov 25 16:02:18 crc kubenswrapper[4743]: I1125 16:02:18.361251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" event={"ID":"970b5a75-2a9a-4525-9da2-33846fac733c","Type":"ContainerStarted","Data":"14953541806a62f7967a649ea6c1cea40fba9473ae5f6247813ce1cddc649877"} Nov 25 16:02:18 crc kubenswrapper[4743]: I1125 
16:02:18.361673 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:18 crc kubenswrapper[4743]: I1125 16:02:18.380310 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" podStartSLOduration=2.380289232 podStartE2EDuration="2.380289232s" podCreationTimestamp="2025-11-25 16:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:18.379202845 +0000 UTC m=+217.501042414" watchObservedRunningTime="2025-11-25 16:02:18.380289232 +0000 UTC m=+217.502128781" Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.077493 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.077557 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.077621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.078773 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c"} 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.078864 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c" gracePeriod=600 Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.373872 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c" exitCode=0 Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.373919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c"} Nov 25 16:02:20 crc kubenswrapper[4743]: I1125 16:02:20.373965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4"} Nov 25 16:02:21 crc kubenswrapper[4743]: I1125 16:02:21.677173 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w8pxz"] Nov 25 16:02:23 crc kubenswrapper[4743]: I1125 16:02:23.454733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:23 crc kubenswrapper[4743]: I1125 16:02:23.901763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:23 crc kubenswrapper[4743]: I1125 16:02:23.940125 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.394777 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kw4x5" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="registry-server" containerID="cri-o://d8d9f90380fc2c9a32125c51b79fe87ec9fe39449dc4b1fb61435c8d6dbad1eb" gracePeriod=2 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.678315 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.715992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.786960 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.787168 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-46wjh" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="registry-server" containerID="cri-o://e0fc2e99cfce81547f7aa4024eeac93e98af8679ecad4fd1fb8a920737352989" gracePeriod=30 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.795840 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.796357 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qsq8z" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="registry-server" 
containerID="cri-o://1eb07f805fe0baf72e7c20b7d6f1b12fc8ac0e69307c5507d975de7dd4e19286" gracePeriod=30 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.802851 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.803057 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" containerID="cri-o://2d385007af954b65673249bd7d784298ec67db397fffe5fe87f6f14afe115541" gracePeriod=30 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.814724 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4ckx"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.815487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.817093 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.817317 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vzbnf" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="registry-server" containerID="cri-o://8a3b9ef2f82bad6f70623d3b7b0841cabf4453d54e80bb811da4baeb8b5866cb" gracePeriod=30 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.825792 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.829737 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4ckx"] Nov 25 16:02:24 crc kubenswrapper[4743]: 
I1125 16:02:24.837251 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.837463 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-djxzs" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="registry-server" containerID="cri-o://5a80d39460535245ce9d480867f2ac6c6612138f9a1a90d575865315717e32aa" gracePeriod=30 Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.888730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.888802 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49psf\" (UniqueName: \"kubernetes.io/projected/84c01433-ed4b-4b70-8473-7905b701f657-kube-api-access-49psf\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.888825 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.990348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.990447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49psf\" (UniqueName: \"kubernetes.io/projected/84c01433-ed4b-4b70-8473-7905b701f657-kube-api-access-49psf\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.990479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.991744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:24 crc kubenswrapper[4743]: I1125 16:02:24.997571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84c01433-ed4b-4b70-8473-7905b701f657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.015430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49psf\" (UniqueName: \"kubernetes.io/projected/84c01433-ed4b-4b70-8473-7905b701f657-kube-api-access-49psf\") pod \"marketplace-operator-79b997595-r4ckx\" (UID: \"84c01433-ed4b-4b70-8473-7905b701f657\") " pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.134043 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.401904 4743 generic.go:334] "Generic (PLEG): container finished" podID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerID="1eb07f805fe0baf72e7c20b7d6f1b12fc8ac0e69307c5507d975de7dd4e19286" exitCode=0 Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.402008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerDied","Data":"1eb07f805fe0baf72e7c20b7d6f1b12fc8ac0e69307c5507d975de7dd4e19286"} Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.404758 4743 generic.go:334] "Generic (PLEG): container finished" podID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerID="d8d9f90380fc2c9a32125c51b79fe87ec9fe39449dc4b1fb61435c8d6dbad1eb" exitCode=0 Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.404846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerDied","Data":"d8d9f90380fc2c9a32125c51b79fe87ec9fe39449dc4b1fb61435c8d6dbad1eb"} Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.405975 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="baeafcba-9592-4136-9893-4bf3b9295041" containerID="2d385007af954b65673249bd7d784298ec67db397fffe5fe87f6f14afe115541" exitCode=0 Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.406066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" event={"ID":"baeafcba-9592-4136-9893-4bf3b9295041","Type":"ContainerDied","Data":"2d385007af954b65673249bd7d784298ec67db397fffe5fe87f6f14afe115541"} Nov 25 16:02:25 crc kubenswrapper[4743]: I1125 16:02:25.526380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r4ckx"] Nov 25 16:02:25 crc kubenswrapper[4743]: W1125 16:02:25.549933 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c01433_ed4b_4b70_8473_7905b701f657.slice/crio-d5f0118a79542dd7b908974bb4a1e9663db1d9389b873d5ad95dd8be1ccfbee8 WatchSource:0}: Error finding container d5f0118a79542dd7b908974bb4a1e9663db1d9389b873d5ad95dd8be1ccfbee8: Status 404 returned error can't find the container with id d5f0118a79542dd7b908974bb4a1e9663db1d9389b873d5ad95dd8be1ccfbee8 Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.162017 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.207995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") pod \"baeafcba-9592-4136-9893-4bf3b9295041\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.208041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") pod \"baeafcba-9592-4136-9893-4bf3b9295041\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.208074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nghlb\" (UniqueName: \"kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb\") pod \"baeafcba-9592-4136-9893-4bf3b9295041\" (UID: \"baeafcba-9592-4136-9893-4bf3b9295041\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.208942 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "baeafcba-9592-4136-9893-4bf3b9295041" (UID: "baeafcba-9592-4136-9893-4bf3b9295041"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.214030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "baeafcba-9592-4136-9893-4bf3b9295041" (UID: "baeafcba-9592-4136-9893-4bf3b9295041"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.222105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb" (OuterVolumeSpecName: "kube-api-access-nghlb") pod "baeafcba-9592-4136-9893-4bf3b9295041" (UID: "baeafcba-9592-4136-9893-4bf3b9295041"). InnerVolumeSpecName "kube-api-access-nghlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.237649 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.243720 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content\") pod \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjfjb\" (UniqueName: \"kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb\") pod \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309462 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content\") pod \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cnvt\" (UniqueName: \"kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt\") pod \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities\") pod \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\" (UID: \"a3cb9b00-75c4-4da6-b6f6-8b92726febae\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities\") pod \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\" (UID: \"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309796 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309810 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/baeafcba-9592-4136-9893-4bf3b9295041-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.309819 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nghlb\" (UniqueName: \"kubernetes.io/projected/baeafcba-9592-4136-9893-4bf3b9295041-kube-api-access-nghlb\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.310709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities" (OuterVolumeSpecName: "utilities") pod "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" (UID: "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.310782 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities" (OuterVolumeSpecName: "utilities") pod "a3cb9b00-75c4-4da6-b6f6-8b92726febae" (UID: "a3cb9b00-75c4-4da6-b6f6-8b92726febae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.313467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt" (OuterVolumeSpecName: "kube-api-access-2cnvt") pod "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" (UID: "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df"). InnerVolumeSpecName "kube-api-access-2cnvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.313723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb" (OuterVolumeSpecName: "kube-api-access-tjfjb") pod "a3cb9b00-75c4-4da6-b6f6-8b92726febae" (UID: "a3cb9b00-75c4-4da6-b6f6-8b92726febae"). InnerVolumeSpecName "kube-api-access-tjfjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.333720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" (UID: "ac1b537d-a3a6-4b63-9fa4-d0b15088c3df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.382550 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3cb9b00-75c4-4da6-b6f6-8b92726febae" (UID: "a3cb9b00-75c4-4da6-b6f6-8b92726febae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410796 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjfjb\" (UniqueName: \"kubernetes.io/projected/a3cb9b00-75c4-4da6-b6f6-8b92726febae-kube-api-access-tjfjb\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410834 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410845 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cnvt\" (UniqueName: \"kubernetes.io/projected/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-kube-api-access-2cnvt\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410855 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cb9b00-75c4-4da6-b6f6-8b92726febae-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410864 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.410872 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.415083 4743 generic.go:334] "Generic (PLEG): container finished" podID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerID="5a80d39460535245ce9d480867f2ac6c6612138f9a1a90d575865315717e32aa" exitCode=0 Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.415168 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerDied","Data":"5a80d39460535245ce9d480867f2ac6c6612138f9a1a90d575865315717e32aa"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.423810 4743 generic.go:334] "Generic (PLEG): container finished" podID="052deadf-ed22-4688-a6fe-0b1039308499" containerID="e0fc2e99cfce81547f7aa4024eeac93e98af8679ecad4fd1fb8a920737352989" exitCode=0 Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.423863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerDied","Data":"e0fc2e99cfce81547f7aa4024eeac93e98af8679ecad4fd1fb8a920737352989"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.427093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qsq8z" event={"ID":"a3cb9b00-75c4-4da6-b6f6-8b92726febae","Type":"ContainerDied","Data":"f7d3bbe473fe402d0e96900137b7bdc2c124836c70917213459278ea82906c07"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.427140 4743 scope.go:117] "RemoveContainer" containerID="1eb07f805fe0baf72e7c20b7d6f1b12fc8ac0e69307c5507d975de7dd4e19286" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.427150 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qsq8z" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.434517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kw4x5" event={"ID":"ac1b537d-a3a6-4b63-9fa4-d0b15088c3df","Type":"ContainerDied","Data":"e2c642ddc57bebb1c06764b62b452aac9b6c6d37fdb173a772a03d325d4dd031"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.434640 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kw4x5" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.443174 4743 generic.go:334] "Generic (PLEG): container finished" podID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerID="8a3b9ef2f82bad6f70623d3b7b0841cabf4453d54e80bb811da4baeb8b5866cb" exitCode=0 Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.443239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerDied","Data":"8a3b9ef2f82bad6f70623d3b7b0841cabf4453d54e80bb811da4baeb8b5866cb"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.446022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" event={"ID":"baeafcba-9592-4136-9893-4bf3b9295041","Type":"ContainerDied","Data":"2e803c23206fbb200c00facd5c5fad70f4f1204969966ca692b936825510303a"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.446099 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pbz24" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.454473 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7rfdr" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="registry-server" containerID="cri-o://17fb8bffe366bda42499c00898393e97dbcae82282aa7a71c88c6480086d0aca" gracePeriod=30 Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.455044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" event={"ID":"84c01433-ed4b-4b70-8473-7905b701f657","Type":"ContainerStarted","Data":"0359e6a7c13260d4b9a45b27bedca3f73589c085b8ecb3de0b8b48b0c91f0ffc"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.455074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" event={"ID":"84c01433-ed4b-4b70-8473-7905b701f657","Type":"ContainerStarted","Data":"d5f0118a79542dd7b908974bb4a1e9663db1d9389b873d5ad95dd8be1ccfbee8"} Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.457303 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.458556 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r4ckx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.458613 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" podUID="84c01433-ed4b-4b70-8473-7905b701f657" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.473711 4743 scope.go:117] "RemoveContainer" containerID="226626d725cf1820d7fc9d332b462337d513b9d8fb083001ecddf14c01de9457" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.486899 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" podStartSLOduration=2.486878842 podStartE2EDuration="2.486878842s" podCreationTimestamp="2025-11-25 16:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:26.486786999 +0000 UTC m=+225.608626568" watchObservedRunningTime="2025-11-25 16:02:26.486878842 +0000 UTC m=+225.608718391" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.514127 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.519339 4743 scope.go:117] "RemoveContainer" containerID="ccf40f4501e44f0d01fd04eab991120fa16e391f4d62ac754fb628ac20f95efa" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.522730 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qsq8z"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.526530 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.542379 4743 scope.go:117] "RemoveContainer" containerID="d8d9f90380fc2c9a32125c51b79fe87ec9fe39449dc4b1fb61435c8d6dbad1eb" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.545349 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kw4x5"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.557613 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.564528 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pbz24"] Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.590209 4743 scope.go:117] "RemoveContainer" containerID="3a428dfc71a42b6c7329af0be4c379438a75948e46bacfef12ce4abd6672eb92" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.661760 4743 scope.go:117] "RemoveContainer" containerID="05f36697cc41613a1b38207d56930956b12608b8205f3cfadcc7b76a1e90e678" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.716465 4743 scope.go:117] "RemoveContainer" containerID="2d385007af954b65673249bd7d784298ec67db397fffe5fe87f6f14afe115541" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.744213 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.754968 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.765329 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.817777 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities\") pod \"052deadf-ed22-4688-a6fe-0b1039308499\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.818062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content\") pod \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.818148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content\") pod \"052deadf-ed22-4688-a6fe-0b1039308499\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.818254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krsqz\" (UniqueName: \"kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz\") pod \"052deadf-ed22-4688-a6fe-0b1039308499\" (UID: \"052deadf-ed22-4688-a6fe-0b1039308499\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.818655 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrsz2\" (UniqueName: \"kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2\") pod \"8897808a-0b5d-4d3b-b512-652b62458b9e\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.819284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities\") pod \"8897808a-0b5d-4d3b-b512-652b62458b9e\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.818851 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities" (OuterVolumeSpecName: "utilities") pod "052deadf-ed22-4688-a6fe-0b1039308499" (UID: "052deadf-ed22-4688-a6fe-0b1039308499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.819455 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content\") pod \"8897808a-0b5d-4d3b-b512-652b62458b9e\" (UID: \"8897808a-0b5d-4d3b-b512-652b62458b9e\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.819703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks7mb\" (UniqueName: \"kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb\") pod \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.819822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities\") pod \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\" (UID: \"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24\") " Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.820877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities" (OuterVolumeSpecName: "utilities") pod 
"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" (UID: "518378d4-cd4d-40dd-bf66-3ba3e0dc9e24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.821340 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.821363 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.821337 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities" (OuterVolumeSpecName: "utilities") pod "8897808a-0b5d-4d3b-b512-652b62458b9e" (UID: "8897808a-0b5d-4d3b-b512-652b62458b9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.822157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2" (OuterVolumeSpecName: "kube-api-access-qrsz2") pod "8897808a-0b5d-4d3b-b512-652b62458b9e" (UID: "8897808a-0b5d-4d3b-b512-652b62458b9e"). InnerVolumeSpecName "kube-api-access-qrsz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.822324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz" (OuterVolumeSpecName: "kube-api-access-krsqz") pod "052deadf-ed22-4688-a6fe-0b1039308499" (UID: "052deadf-ed22-4688-a6fe-0b1039308499"). InnerVolumeSpecName "kube-api-access-krsqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.824443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb" (OuterVolumeSpecName: "kube-api-access-ks7mb") pod "518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" (UID: "518378d4-cd4d-40dd-bf66-3ba3e0dc9e24"). InnerVolumeSpecName "kube-api-access-ks7mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.850147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8897808a-0b5d-4d3b-b512-652b62458b9e" (UID: "8897808a-0b5d-4d3b-b512-652b62458b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.878096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "052deadf-ed22-4688-a6fe-0b1039308499" (UID: "052deadf-ed22-4688-a6fe-0b1039308499"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922895 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/052deadf-ed22-4688-a6fe-0b1039308499-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922932 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrsz2\" (UniqueName: \"kubernetes.io/projected/8897808a-0b5d-4d3b-b512-652b62458b9e-kube-api-access-qrsz2\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922944 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krsqz\" (UniqueName: \"kubernetes.io/projected/052deadf-ed22-4688-a6fe-0b1039308499-kube-api-access-krsqz\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922953 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922971 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8897808a-0b5d-4d3b-b512-652b62458b9e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.922979 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks7mb\" (UniqueName: \"kubernetes.io/projected/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-kube-api-access-ks7mb\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:26 crc kubenswrapper[4743]: I1125 16:02:26.927283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" (UID: "518378d4-cd4d-40dd-bf66-3ba3e0dc9e24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.024257 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.461077 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzbnf" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.461074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzbnf" event={"ID":"8897808a-0b5d-4d3b-b512-652b62458b9e","Type":"ContainerDied","Data":"ee78891ed724359cbc9dc8342461fc90d8458afcf94846627e46087a1f42d2d5"} Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.461691 4743 scope.go:117] "RemoveContainer" containerID="8a3b9ef2f82bad6f70623d3b7b0841cabf4453d54e80bb811da4baeb8b5866cb" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.464181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djxzs" event={"ID":"518378d4-cd4d-40dd-bf66-3ba3e0dc9e24","Type":"ContainerDied","Data":"770b58829b67dbf4ba0f78b65dd940604a8e0b446b80ac44661a714959318a3c"} Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.464230 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djxzs" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.466471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46wjh" event={"ID":"052deadf-ed22-4688-a6fe-0b1039308499","Type":"ContainerDied","Data":"bcf8998037a328539240ed69571ff4f9022a7f6d1e6e7c6dbdb026847e3a4422"} Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.466556 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46wjh" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.469328 4743 generic.go:334] "Generic (PLEG): container finished" podID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerID="17fb8bffe366bda42499c00898393e97dbcae82282aa7a71c88c6480086d0aca" exitCode=0 Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.469391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerDied","Data":"17fb8bffe366bda42499c00898393e97dbcae82282aa7a71c88c6480086d0aca"} Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.473784 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r4ckx" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.475673 4743 scope.go:117] "RemoveContainer" containerID="7db7089c06e955fec1f8c6ee8416a1890c2d7f0602ec7b6b1aafbf1ec4a48333" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.497043 4743 scope.go:117] "RemoveContainer" containerID="f9a9798d425fe5e74513e4ea933986f7e8952cdf06c6f7abe5dbf2a0de8fb35d" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.518701 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.522431 4743 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzbnf"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.522650 4743 scope.go:117] "RemoveContainer" containerID="5a80d39460535245ce9d480867f2ac6c6612138f9a1a90d575865315717e32aa" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.523802 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.526898 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-djxzs"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.533979 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.535940 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-46wjh"] Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.544364 4743 scope.go:117] "RemoveContainer" containerID="34645ddc3a2c0f27af4b13786b8fdc20509648d547692b30627846198603d6c1" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.558822 4743 scope.go:117] "RemoveContainer" containerID="e4db81339c378fa88b4d50a026cd27e088d6cfd4c614ec1f886ba1137b91892f" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.572865 4743 scope.go:117] "RemoveContainer" containerID="e0fc2e99cfce81547f7aa4024eeac93e98af8679ecad4fd1fb8a920737352989" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.585008 4743 scope.go:117] "RemoveContainer" containerID="a3918d7b98c7f735047f2d566fbc08040cf9d9f506834a9612aaaa0cbe365e2b" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.597809 4743 scope.go:117] "RemoveContainer" containerID="ad900e0a9517100fbe2dc14af056cef1570ee013a4471e542d0e3e538ddfa30d" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.786582 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="052deadf-ed22-4688-a6fe-0b1039308499" path="/var/lib/kubelet/pods/052deadf-ed22-4688-a6fe-0b1039308499/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.787377 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" path="/var/lib/kubelet/pods/518378d4-cd4d-40dd-bf66-3ba3e0dc9e24/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.788151 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" path="/var/lib/kubelet/pods/8897808a-0b5d-4d3b-b512-652b62458b9e/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.789449 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" path="/var/lib/kubelet/pods/a3cb9b00-75c4-4da6-b6f6-8b92726febae/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.790662 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" path="/var/lib/kubelet/pods/ac1b537d-a3a6-4b63-9fa4-d0b15088c3df/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.792198 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baeafcba-9592-4136-9893-4bf3b9295041" path="/var/lib/kubelet/pods/baeafcba-9592-4136-9893-4bf3b9295041/volumes" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.855119 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.941882 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247rj\" (UniqueName: \"kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj\") pod \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.942005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities\") pod \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.942062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content\") pod \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\" (UID: \"11a83874-5ff8-4255-b12b-d25ee3e9f5f4\") " Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.942863 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities" (OuterVolumeSpecName: "utilities") pod "11a83874-5ff8-4255-b12b-d25ee3e9f5f4" (UID: "11a83874-5ff8-4255-b12b-d25ee3e9f5f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:27 crc kubenswrapper[4743]: I1125 16:02:27.946622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj" (OuterVolumeSpecName: "kube-api-access-247rj") pod "11a83874-5ff8-4255-b12b-d25ee3e9f5f4" (UID: "11a83874-5ff8-4255-b12b-d25ee3e9f5f4"). InnerVolumeSpecName "kube-api-access-247rj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.035087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11a83874-5ff8-4255-b12b-d25ee3e9f5f4" (UID: "11a83874-5ff8-4255-b12b-d25ee3e9f5f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.043101 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.043131 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.043142 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247rj\" (UniqueName: \"kubernetes.io/projected/11a83874-5ff8-4255-b12b-d25ee3e9f5f4-kube-api-access-247rj\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.479619 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rfdr" event={"ID":"11a83874-5ff8-4255-b12b-d25ee3e9f5f4","Type":"ContainerDied","Data":"e9333a331cd760bdc6fe740c4cfe33112c7d0eaafb5f1db5783b00422731195e"} Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.479662 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rfdr" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.479935 4743 scope.go:117] "RemoveContainer" containerID="17fb8bffe366bda42499c00898393e97dbcae82282aa7a71c88c6480086d0aca" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.495883 4743 scope.go:117] "RemoveContainer" containerID="4eafcf54262a2945befe2d7a6c8061ad0dd96e35110307bf54f7ba847a959412" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.502214 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.507825 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7rfdr"] Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.525300 4743 scope.go:117] "RemoveContainer" containerID="2a86f81a16b515a48a788dbb7670e9a6a4216925067980f0da265a2898bd6548" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.894330 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hsk2l"] Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.894921 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895034 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895118 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895198 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895280 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895360 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895443 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895522 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895616 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895698 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895788 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.895868 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.895942 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896016 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896098 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896164 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896306 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896380 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896452 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896527 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896617 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896698 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896766 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896843 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.896914 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.896995 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897067 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.897136 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897204 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="extract-utilities" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.897275 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897333 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.897421 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897489 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.897572 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897664 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: E1125 16:02:28.897743 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.897833 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="extract-content" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8897808a-0b5d-4d3b-b512-652b62458b9e" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898129 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898188 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="baeafcba-9592-4136-9893-4bf3b9295041" containerName="marketplace-operator" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898241 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="518378d4-cd4d-40dd-bf66-3ba3e0dc9e24" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898314 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cb9b00-75c4-4da6-b6f6-8b92726febae" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.898384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="052deadf-ed22-4688-a6fe-0b1039308499" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 
16:02:28.898450 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1b537d-a3a6-4b63-9fa4-d0b15088c3df" containerName="registry-server" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.899405 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.902129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.908162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsk2l"] Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.954730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-catalog-content\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.954973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g25v\" (UniqueName: \"kubernetes.io/projected/5bf56610-e316-490a-b030-094e92f0f76d-kube-api-access-5g25v\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:28 crc kubenswrapper[4743]: I1125 16:02:28.955070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-utilities\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc 
kubenswrapper[4743]: I1125 16:02:29.058575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-catalog-content\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.058640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g25v\" (UniqueName: \"kubernetes.io/projected/5bf56610-e316-490a-b030-094e92f0f76d-kube-api-access-5g25v\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.058675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-utilities\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.059182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-utilities\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.059477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bf56610-e316-490a-b030-094e92f0f76d-catalog-content\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.087192 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g25v\" (UniqueName: \"kubernetes.io/projected/5bf56610-e316-490a-b030-094e92f0f76d-kube-api-access-5g25v\") pod \"community-operators-hsk2l\" (UID: \"5bf56610-e316-490a-b030-094e92f0f76d\") " pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.223135 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.491748 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kbcvk"] Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.493738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.497914 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.498173 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbcvk"] Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.565368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jml\" (UniqueName: \"kubernetes.io/projected/033a7590-8333-4c20-8f6a-71c2f7410c3f-kube-api-access-s9jml\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.565431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-utilities\") pod \"redhat-marketplace-kbcvk\" (UID: 
\"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.565639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-catalog-content\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.600682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hsk2l"] Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.667036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9jml\" (UniqueName: \"kubernetes.io/projected/033a7590-8333-4c20-8f6a-71c2f7410c3f-kube-api-access-s9jml\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.667148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-utilities\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.667197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-catalog-content\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.667702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-utilities\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.667916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033a7590-8333-4c20-8f6a-71c2f7410c3f-catalog-content\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.687331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9jml\" (UniqueName: \"kubernetes.io/projected/033a7590-8333-4c20-8f6a-71c2f7410c3f-kube-api-access-s9jml\") pod \"redhat-marketplace-kbcvk\" (UID: \"033a7590-8333-4c20-8f6a-71c2f7410c3f\") " pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.784487 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a83874-5ff8-4255-b12b-d25ee3e9f5f4" path="/var/lib/kubelet/pods/11a83874-5ff8-4255-b12b-d25ee3e9f5f4/volumes" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.809699 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:29 crc kubenswrapper[4743]: I1125 16:02:29.980222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbcvk"] Nov 25 16:02:29 crc kubenswrapper[4743]: W1125 16:02:29.984821 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod033a7590_8333_4c20_8f6a_71c2f7410c3f.slice/crio-1300b6a74c9e5b80a397cc78886806a2c29f58a96f474453aa1d20c7fc174866 WatchSource:0}: Error finding container 1300b6a74c9e5b80a397cc78886806a2c29f58a96f474453aa1d20c7fc174866: Status 404 returned error can't find the container with id 1300b6a74c9e5b80a397cc78886806a2c29f58a96f474453aa1d20c7fc174866 Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 16:02:30.495224 4743 generic.go:334] "Generic (PLEG): container finished" podID="033a7590-8333-4c20-8f6a-71c2f7410c3f" containerID="e2e0778fa8554dc85e0716dd78cd3995fcdfdc7b7ea067ac98c087155c91869c" exitCode=0 Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 16:02:30.495304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbcvk" event={"ID":"033a7590-8333-4c20-8f6a-71c2f7410c3f","Type":"ContainerDied","Data":"e2e0778fa8554dc85e0716dd78cd3995fcdfdc7b7ea067ac98c087155c91869c"} Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 16:02:30.495701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbcvk" event={"ID":"033a7590-8333-4c20-8f6a-71c2f7410c3f","Type":"ContainerStarted","Data":"1300b6a74c9e5b80a397cc78886806a2c29f58a96f474453aa1d20c7fc174866"} Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 16:02:30.498099 4743 generic.go:334] "Generic (PLEG): container finished" podID="5bf56610-e316-490a-b030-094e92f0f76d" containerID="449c8bef2537b581e5feac4f43e83961b3446c553432aedc18bd8f5463774d13" exitCode=0 Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 
16:02:30.498140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsk2l" event={"ID":"5bf56610-e316-490a-b030-094e92f0f76d","Type":"ContainerDied","Data":"449c8bef2537b581e5feac4f43e83961b3446c553432aedc18bd8f5463774d13"} Nov 25 16:02:30 crc kubenswrapper[4743]: I1125 16:02:30.498167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsk2l" event={"ID":"5bf56610-e316-490a-b030-094e92f0f76d","Type":"ContainerStarted","Data":"e90c4a4d9aa08887029fc37ca27321b9fe26bd2371739cba86f010ecce119b32"} Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.287658 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vz76"] Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.288952 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.290634 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.304230 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vz76"] Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.391161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-catalog-content\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.391220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-utilities\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.391242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7cnz\" (UniqueName: \"kubernetes.io/projected/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-kube-api-access-j7cnz\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.493778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-catalog-content\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.493957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-utilities\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.494006 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7cnz\" (UniqueName: \"kubernetes.io/projected/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-kube-api-access-j7cnz\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.494221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-catalog-content\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.494296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-utilities\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.527536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7cnz\" (UniqueName: \"kubernetes.io/projected/48cfc5c0-943f-4d89-80f1-bc08e3c3a589-kube-api-access-j7cnz\") pod \"certified-operators-9vz76\" (UID: \"48cfc5c0-943f-4d89-80f1-bc08e3c3a589\") " pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.616254 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.891369 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gmt44"] Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.892758 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.894285 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.902875 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmt44"] Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.999521 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntmj\" (UniqueName: \"kubernetes.io/projected/f01f0e90-72f1-4251-b010-4f32a5ba0741-kube-api-access-sntmj\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.999580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-catalog-content\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:31 crc kubenswrapper[4743]: I1125 16:02:31.999630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-utilities\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.022639 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vz76"] Nov 25 16:02:32 crc kubenswrapper[4743]: W1125 16:02:32.030749 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48cfc5c0_943f_4d89_80f1_bc08e3c3a589.slice/crio-e1fa7f4e7b8e1f263c1c2cd0d8a23063811adc4b2afdf646e6faa1af2bfc7b08 WatchSource:0}: Error finding container e1fa7f4e7b8e1f263c1c2cd0d8a23063811adc4b2afdf646e6faa1af2bfc7b08: Status 404 returned error can't find the container with id e1fa7f4e7b8e1f263c1c2cd0d8a23063811adc4b2afdf646e6faa1af2bfc7b08 Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.100585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntmj\" (UniqueName: \"kubernetes.io/projected/f01f0e90-72f1-4251-b010-4f32a5ba0741-kube-api-access-sntmj\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.100647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-catalog-content\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.100667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-utilities\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.101118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-utilities\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: 
I1125 16:02:32.101294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f0e90-72f1-4251-b010-4f32a5ba0741-catalog-content\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.121352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntmj\" (UniqueName: \"kubernetes.io/projected/f01f0e90-72f1-4251-b010-4f32a5ba0741-kube-api-access-sntmj\") pod \"redhat-operators-gmt44\" (UID: \"f01f0e90-72f1-4251-b010-4f32a5ba0741\") " pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.216860 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.510107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vz76" event={"ID":"48cfc5c0-943f-4d89-80f1-bc08e3c3a589","Type":"ContainerStarted","Data":"e1fa7f4e7b8e1f263c1c2cd0d8a23063811adc4b2afdf646e6faa1af2bfc7b08"} Nov 25 16:02:32 crc kubenswrapper[4743]: I1125 16:02:32.602796 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmt44"] Nov 25 16:02:32 crc kubenswrapper[4743]: W1125 16:02:32.609847 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01f0e90_72f1_4251_b010_4f32a5ba0741.slice/crio-9cef67cfa49404bf5984640d6d73e1cec436d16fca1b615300182c854c8ae03c WatchSource:0}: Error finding container 9cef67cfa49404bf5984640d6d73e1cec436d16fca1b615300182c854c8ae03c: Status 404 returned error can't find the container with id 9cef67cfa49404bf5984640d6d73e1cec436d16fca1b615300182c854c8ae03c Nov 25 16:02:33 crc 
kubenswrapper[4743]: I1125 16:02:33.516619 4743 generic.go:334] "Generic (PLEG): container finished" podID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" containerID="16cebe4fdc30e1cc170bb25175ceb89e1598a539b64f600fe5312ce38fe8fcdb" exitCode=0 Nov 25 16:02:33 crc kubenswrapper[4743]: I1125 16:02:33.516780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vz76" event={"ID":"48cfc5c0-943f-4d89-80f1-bc08e3c3a589","Type":"ContainerDied","Data":"16cebe4fdc30e1cc170bb25175ceb89e1598a539b64f600fe5312ce38fe8fcdb"} Nov 25 16:02:33 crc kubenswrapper[4743]: I1125 16:02:33.518899 4743 generic.go:334] "Generic (PLEG): container finished" podID="f01f0e90-72f1-4251-b010-4f32a5ba0741" containerID="3151e3db97f7ccdcbd5690934cfb59fc84ab571fd55a3e027020e3e6ea9cb1fb" exitCode=0 Nov 25 16:02:33 crc kubenswrapper[4743]: I1125 16:02:33.518942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmt44" event={"ID":"f01f0e90-72f1-4251-b010-4f32a5ba0741","Type":"ContainerDied","Data":"3151e3db97f7ccdcbd5690934cfb59fc84ab571fd55a3e027020e3e6ea9cb1fb"} Nov 25 16:02:33 crc kubenswrapper[4743]: I1125 16:02:33.518970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmt44" event={"ID":"f01f0e90-72f1-4251-b010-4f32a5ba0741","Type":"ContainerStarted","Data":"9cef67cfa49404bf5984640d6d73e1cec436d16fca1b615300182c854c8ae03c"} Nov 25 16:02:36 crc kubenswrapper[4743]: I1125 16:02:36.587254 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-98w29" Nov 25 16:02:36 crc kubenswrapper[4743]: I1125 16:02:36.633031 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-td7r9"] Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.926007 4743 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.926874 4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.927123 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a" gracePeriod=15 Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.927177 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.927188 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079" gracePeriod=15 Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.927237 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02" gracePeriod=15 Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.927188 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240" gracePeriod=15 Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 
16:02:37.927223 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99" gracePeriod=15 Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929514 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929777 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929812 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929824 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929845 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929851 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929861 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929867 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929875 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929884 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929902 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929910 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 16:02:37 crc kubenswrapper[4743]: E1125 16:02:37.929919 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.929926 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930043 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930059 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930067 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930076 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930087 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.930096 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 25 16:02:37 crc kubenswrapper[4743]: I1125 16:02:37.973325 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.076572 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.076661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.076684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.077445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.077540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.077606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.077644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.077687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.145435 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.145495 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178571 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178693 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 
16:02:38.178751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.178734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.270943 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 16:02:38 crc kubenswrapper[4743]: W1125 16:02:38.380289 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c01cc2784eff0b15a22b282875f65efd0f1f88ab7ec9d9f12f755d6e4c56e04b WatchSource:0}: Error finding container c01cc2784eff0b15a22b282875f65efd0f1f88ab7ec9d9f12f755d6e4c56e04b: Status 404 returned error can't find the container with id c01cc2784eff0b15a22b282875f65efd0f1f88ab7ec9d9f12f755d6e4c56e04b Nov 25 16:02:38 crc kubenswrapper[4743]: E1125 16:02:38.383460 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b4b60d96307c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 16:02:38.381770691 +0000 UTC m=+237.503610240,LastTimestamp:2025-11-25 16:02:38.381770691 +0000 UTC m=+237.503610240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.553550 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="f01f0e90-72f1-4251-b010-4f32a5ba0741" containerID="3182e2743dda401417af4b3924ae80e9f841dec9ff329345ac701516829a470c" exitCode=0 Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.553660 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmt44" event={"ID":"f01f0e90-72f1-4251-b010-4f32a5ba0741","Type":"ContainerDied","Data":"3182e2743dda401417af4b3924ae80e9f841dec9ff329345ac701516829a470c"} Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.556045 4743 generic.go:334] "Generic (PLEG): container finished" podID="033a7590-8333-4c20-8f6a-71c2f7410c3f" containerID="cd35398e13343d07c9a6384c9356378bb08c01687d18c79af5f03cafff69b846" exitCode=0 Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.556099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbcvk" event={"ID":"033a7590-8333-4c20-8f6a-71c2f7410c3f","Type":"ContainerDied","Data":"cd35398e13343d07c9a6384c9356378bb08c01687d18c79af5f03cafff69b846"} Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.557125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c01cc2784eff0b15a22b282875f65efd0f1f88ab7ec9d9f12f755d6e4c56e04b"} Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.558786 4743 generic.go:334] "Generic (PLEG): container finished" podID="5bf56610-e316-490a-b030-094e92f0f76d" containerID="84b41620e714b690100b42b415a7567433a2250db4a07162a4ba1f87239d0cfd" exitCode=0 Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.558849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsk2l" event={"ID":"5bf56610-e316-490a-b030-094e92f0f76d","Type":"ContainerDied","Data":"84b41620e714b690100b42b415a7567433a2250db4a07162a4ba1f87239d0cfd"} Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 
16:02:38.559719 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.560068 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.560375 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.560690 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.562010 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.562837 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02" exitCode=2 Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.564227 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" containerID="cdcbe87402a824de045a9ed8ebf2a6589fc4284f98ce43c66f1989fdaeaaf513" exitCode=0 Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.564260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vz76" event={"ID":"48cfc5c0-943f-4d89-80f1-bc08e3c3a589","Type":"ContainerDied","Data":"cdcbe87402a824de045a9ed8ebf2a6589fc4284f98ce43c66f1989fdaeaaf513"} Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.565070 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.565570 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.565921 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:38 crc kubenswrapper[4743]: I1125 16:02:38.566174 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.571579 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.573167 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.573884 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240" exitCode=0 Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.573910 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079" exitCode=0 Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.573920 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99" exitCode=0 Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.573976 4743 scope.go:117] "RemoveContainer" containerID="84a3a65d83dc1e965e67573a87e62f0f7b8ebfa3bef5ede9c080457cc7e04df8" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.575703 4743 generic.go:334] "Generic (PLEG): container finished" podID="40d489a4-a435-45ed-b549-aec8103bb098" containerID="8ee6a6213a4ba9f12ba0798d5596754e90db36e6b601375337fef9f01976c6e4" exitCode=0 Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.575795 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d489a4-a435-45ed-b549-aec8103bb098","Type":"ContainerDied","Data":"8ee6a6213a4ba9f12ba0798d5596754e90db36e6b601375337fef9f01976c6e4"} Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.576451 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.576718 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.577039 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.577466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9"} Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.577516 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.578077 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.578621 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.578840 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.579098 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.579505 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.579915 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.580260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hsk2l" event={"ID":"5bf56610-e316-490a-b030-094e92f0f76d","Type":"ContainerStarted","Data":"0745749a269da4ddaef112d7da8fa9a1da964ee7b46d25c364ca99c8394316fd"} Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.580771 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.581099 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.581444 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 
38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.581739 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.581986 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.582211 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.582426 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.582809 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.583069 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.583302 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.583504 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.583726 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.583925 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4743]: I1125 16:02:39.584089 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.363139 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.364354 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.364958 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.365246 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.365827 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.366261 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.366533 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.366844 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.367070 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516577 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516833 4743 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516843 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.516852 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.587930 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.588766 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a" exitCode=0 Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.588864 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.588875 4743 scope.go:117] "RemoveContainer" containerID="2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.591192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vz76" event={"ID":"48cfc5c0-943f-4d89-80f1-bc08e3c3a589","Type":"ContainerStarted","Data":"d631837a17cd7f610de7309380d839ff0084c79f89fc10cb2e4738278f70fba1"} Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.592536 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.593024 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.593418 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.593647 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.593951 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.594223 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.594535 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.595891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbcvk" event={"ID":"033a7590-8333-4c20-8f6a-71c2f7410c3f","Type":"ContainerStarted","Data":"24921c6e811bf3478ecb89a798272bf59f561faf206d346b96b6440651e3c2b1"} Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.596781 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.597165 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.597440 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.597768 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.598906 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.599207 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.599478 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.609913 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.610416 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.610697 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.610931 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.611265 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.611480 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.611727 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.616773 4743 scope.go:117] "RemoveContainer" containerID="35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.652858 4743 scope.go:117] "RemoveContainer" containerID="ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.670762 4743 scope.go:117] "RemoveContainer" containerID="b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.706354 4743 scope.go:117] "RemoveContainer" 
containerID="7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.725783 4743 scope.go:117] "RemoveContainer" containerID="addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.770762 4743 scope.go:117] "RemoveContainer" containerID="2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.771710 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\": container with ID starting with 2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240 not found: ID does not exist" containerID="2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.771772 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240"} err="failed to get container status \"2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\": rpc error: code = NotFound desc = could not find container \"2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240\": container with ID starting with 2502068ddbc901a525bc6185605c57e799635e4b2cd3700ab392dd5f0ec55240 not found: ID does not exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.771800 4743 scope.go:117] "RemoveContainer" containerID="35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.772173 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\": container with ID starting with 
35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079 not found: ID does not exist" containerID="35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.772201 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079"} err="failed to get container status \"35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\": rpc error: code = NotFound desc = could not find container \"35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079\": container with ID starting with 35881b3e60dd5403fa35d343e8c89727d8967671d6c43efda441b87a304d3079 not found: ID does not exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.772223 4743 scope.go:117] "RemoveContainer" containerID="ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.772464 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\": container with ID starting with ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99 not found: ID does not exist" containerID="ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.772487 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99"} err="failed to get container status \"ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\": rpc error: code = NotFound desc = could not find container \"ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99\": container with ID starting with ab22d495d07c188dcf4d8c57485e78fc2ea36b3694e15c55283695b5aa064b99 not found: ID does not 
exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.772504 4743 scope.go:117] "RemoveContainer" containerID="b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.778131 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\": container with ID starting with b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02 not found: ID does not exist" containerID="b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.778178 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02"} err="failed to get container status \"b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\": rpc error: code = NotFound desc = could not find container \"b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02\": container with ID starting with b6719043850b6c1749b3f0ac4494b1bc9e8e7eade25fd272160a7272b2b50f02 not found: ID does not exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.778210 4743 scope.go:117] "RemoveContainer" containerID="7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.783701 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\": container with ID starting with 7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a not found: ID does not exist" containerID="7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.783741 4743 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a"} err="failed to get container status \"7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\": rpc error: code = NotFound desc = could not find container \"7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a\": container with ID starting with 7bdf5ec113a79e877e145a74b472f1d9544819d6a896f785d72650e7882db17a not found: ID does not exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.783766 4743 scope.go:117] "RemoveContainer" containerID="addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2" Nov 25 16:02:40 crc kubenswrapper[4743]: E1125 16:02:40.784105 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\": container with ID starting with addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2 not found: ID does not exist" containerID="addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.784127 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2"} err="failed to get container status \"addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\": rpc error: code = NotFound desc = could not find container \"addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2\": container with ID starting with addbb2ab893e0457a3248ae010dd211e481ca9a3637c09c3b419853b12ba02b2 not found: ID does not exist" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.874154 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.874645 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.874847 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.875055 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.875305 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.875490 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.875678 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:40 crc kubenswrapper[4743]: I1125 16:02:40.875894 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.021936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access\") pod \"40d489a4-a435-45ed-b549-aec8103bb098\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock\") pod \"40d489a4-a435-45ed-b549-aec8103bb098\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock" (OuterVolumeSpecName: "var-lock") pod "40d489a4-a435-45ed-b549-aec8103bb098" (UID: "40d489a4-a435-45ed-b549-aec8103bb098"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir\") pod \"40d489a4-a435-45ed-b549-aec8103bb098\" (UID: \"40d489a4-a435-45ed-b549-aec8103bb098\") " Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "40d489a4-a435-45ed-b549-aec8103bb098" (UID: "40d489a4-a435-45ed-b549-aec8103bb098"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022666 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-var-lock\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.022684 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40d489a4-a435-45ed-b549-aec8103bb098-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.029752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "40d489a4-a435-45ed-b549-aec8103bb098" (UID: "40d489a4-a435-45ed-b549-aec8103bb098"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.124208 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40d489a4-a435-45ed-b549-aec8103bb098-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.267212 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.267502 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.267818 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.268067 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.268264 4743 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.268291 4743 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" 
Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.268428 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="200ms" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.468794 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="400ms" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.611447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"40d489a4-a435-45ed-b549-aec8103bb098","Type":"ContainerDied","Data":"fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe"} Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.611757 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdf35b3eabc202f2ef48203a11b5743ed707062af4247895d5d89a36c0444fbe" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.611833 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.617507 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.617710 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.623951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmt44" event={"ID":"f01f0e90-72f1-4251-b010-4f32a5ba0741","Type":"ContainerStarted","Data":"b53bb37317c41c338163038cecd8272929ba339af4287a8b07a18245fcdd4159"} Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.625459 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.625859 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.629777 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc 
kubenswrapper[4743]: I1125 16:02:41.630920 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.631617 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.631895 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.632156 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.634314 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 
16:02:41.634880 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.635114 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.635308 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.635962 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.636231 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 
16:02:41.636404 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.667789 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.668895 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.669399 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.669877 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.670176 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.670407 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.670687 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.670948 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.777994 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.779356 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.780315 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.780761 4743 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.781171 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.781490 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.781693 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection 
refused" Nov 25 16:02:41 crc kubenswrapper[4743]: I1125 16:02:41.782688 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:41 crc kubenswrapper[4743]: E1125 16:02:41.870144 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="800ms" Nov 25 16:02:42 crc kubenswrapper[4743]: I1125 16:02:42.217942 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:42 crc kubenswrapper[4743]: I1125 16:02:42.218011 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:42 crc kubenswrapper[4743]: E1125 16:02:42.671428 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="1.6s" Nov 25 16:02:43 crc kubenswrapper[4743]: I1125 16:02:43.250508 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gmt44" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" containerName="registry-server" probeResult="failure" output=< Nov 25 16:02:43 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 25 16:02:43 crc kubenswrapper[4743]: > Nov 25 16:02:44 crc kubenswrapper[4743]: E1125 16:02:44.272292 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="3.2s" Nov 25 16:02:46 crc kubenswrapper[4743]: I1125 16:02:46.707432 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" containerID="cri-o://a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876" gracePeriod=15 Nov 25 16:02:46 crc kubenswrapper[4743]: E1125 16:02:46.818073 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.166:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b4b60d96307c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 16:02:38.381770691 +0000 UTC m=+237.503610240,LastTimestamp:2025-11-25 16:02:38.381770691 +0000 UTC m=+237.503610240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.066254 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.066711 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.067156 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.067566 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.067919 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.068361 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.068637 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.068904 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzmr2\" (UniqueName: \"kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197781 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197833 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.197960 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 
16:02:47.197989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.198016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.198058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert\") pod \"b7e20dd3-f239-419d-bc24-5e38d66e7803\" (UID: \"b7e20dd3-f239-419d-bc24-5e38d66e7803\") " Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.198256 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.198545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.198553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.199026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.199755 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.204714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2" (OuterVolumeSpecName: "kube-api-access-wzmr2") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "kube-api-access-wzmr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.205324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.206606 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.206979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.207198 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.207492 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.207760 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.207951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.208126 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b7e20dd3-f239-419d-bc24-5e38d66e7803" (UID: "b7e20dd3-f239-419d-bc24-5e38d66e7803"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.298695 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.298916 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299018 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299105 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299179 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299241 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299315 4743 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wzmr2\" (UniqueName: \"kubernetes.io/projected/b7e20dd3-f239-419d-bc24-5e38d66e7803-kube-api-access-wzmr2\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299393 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299479 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299564 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299657 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299718 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7e20dd3-f239-419d-bc24-5e38d66e7803-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.299780 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b7e20dd3-f239-419d-bc24-5e38d66e7803-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:02:47 crc kubenswrapper[4743]: 
E1125 16:02:47.474051 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.166:6443: connect: connection refused" interval="6.4s" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.662716 4743 generic.go:334] "Generic (PLEG): container finished" podID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerID="a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876" exitCode=0 Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.662761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" event={"ID":"b7e20dd3-f239-419d-bc24-5e38d66e7803","Type":"ContainerDied","Data":"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876"} Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.662787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" event={"ID":"b7e20dd3-f239-419d-bc24-5e38d66e7803","Type":"ContainerDied","Data":"425ae9c853d5ab80fd7c0249df4c19af14b2530b615888fa56915c5ec9ce9968"} Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.662803 4743 scope.go:117] "RemoveContainer" containerID="a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.662911 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.663759 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.664035 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.664283 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.664513 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.664813 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.665269 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.665536 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.677851 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.678147 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.678463 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" 
pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.678877 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.679140 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.679408 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.679781 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.684496 4743 scope.go:117] "RemoveContainer" containerID="a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876" Nov 25 16:02:47 crc 
kubenswrapper[4743]: E1125 16:02:47.684871 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876\": container with ID starting with a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876 not found: ID does not exist" containerID="a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876" Nov 25 16:02:47 crc kubenswrapper[4743]: I1125 16:02:47.684900 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876"} err="failed to get container status \"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876\": rpc error: code = NotFound desc = could not find container \"a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876\": container with ID starting with a80d0c6bb3601fff2633ab439874e613fa6713242d7f507e2e575088498b5876 not found: ID does not exist" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.223731 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.223841 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.266522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.267186 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection 
refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.267449 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.267688 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.267949 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.268289 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.268535 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 
25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.268805 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.710978 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hsk2l" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.711667 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.712144 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.712672 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.713013 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.713610 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.713877 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.714136 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.810872 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.811257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.845532 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 
16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.846181 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.846680 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.847195 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.847527 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.847862 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" 
Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.848173 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:49 crc kubenswrapper[4743]: I1125 16:02:49.848421 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.712299 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kbcvk" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.712644 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.712817 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.712982 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" 
pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.713155 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.713370 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.713511 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.713862 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.774383 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.775222 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.775567 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.775921 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.776538 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.776937 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.777285 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.777643 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.788863 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.788892 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:50 crc kubenswrapper[4743]: E1125 16:02:50.789397 4743 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:50 crc kubenswrapper[4743]: I1125 16:02:50.790062 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:50 crc kubenswrapper[4743]: W1125 16:02:50.806431 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-853fd0441bf2b9947956c32cc4604129ce96946392cae6ed905454d3730b78e8 WatchSource:0}: Error finding container 853fd0441bf2b9947956c32cc4604129ce96946392cae6ed905454d3730b78e8: Status 404 returned error can't find the container with id 853fd0441bf2b9947956c32cc4604129ce96946392cae6ed905454d3730b78e8 Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.653680 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vz76" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.654648 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.655177 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.655467 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: 
connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.655727 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.655984 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.656168 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.656315 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.683831 4743 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8ca5eaa6770520d15bc22c092677fe4dc3419163a0085c41eb2ce0e1294e9dc2" exitCode=0 Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.684478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8ca5eaa6770520d15bc22c092677fe4dc3419163a0085c41eb2ce0e1294e9dc2"} Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.684512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"853fd0441bf2b9947956c32cc4604129ce96946392cae6ed905454d3730b78e8"} Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.684714 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.684735 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.685228 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: E1125 16:02:51.685356 4743 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.685392 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial 
tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.685561 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.685835 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.685990 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.686210 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.686594 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" pod="openshift-marketplace/certified-operators-9vz76" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.787371 4743 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.787842 4743 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.788366 4743 status_manager.go:851] "Failed to get status for pod" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" pod="openshift-authentication/oauth-openshift-558db77b4-w8pxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-w8pxz\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.788839 4743 status_manager.go:851] "Failed to get status for pod" podUID="5bf56610-e316-490a-b030-094e92f0f76d" pod="openshift-marketplace/community-operators-hsk2l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-hsk2l\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.789377 4743 status_manager.go:851] "Failed to get status for pod" podUID="48cfc5c0-943f-4d89-80f1-bc08e3c3a589" 
pod="openshift-marketplace/certified-operators-9vz76" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9vz76\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.789925 4743 status_manager.go:851] "Failed to get status for pod" podUID="40d489a4-a435-45ed-b549-aec8103bb098" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.790665 4743 status_manager.go:851] "Failed to get status for pod" podUID="033a7590-8333-4c20-8f6a-71c2f7410c3f" pod="openshift-marketplace/redhat-marketplace-kbcvk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kbcvk\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:51 crc kubenswrapper[4743]: I1125 16:02:51.791147 4743 status_manager.go:851] "Failed to get status for pod" podUID="f01f0e90-72f1-4251-b010-4f32a5ba0741" pod="openshift-marketplace/redhat-operators-gmt44" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gmt44\": dial tcp 38.102.83.166:6443: connect: connection refused" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.254054 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.295881 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gmt44" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.692100 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.692404 4743 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b" exitCode=1 Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.692460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.693090 4743 scope.go:117] "RemoveContainer" containerID="8a45e151a797c21b45f561fb3214b11055d3fa18b79d6d00aaa40ee733698b2b" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e79a23b6ad972513e5741e16aeb159a6195bd3bbeac6d126df396c3dcd9ce378"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39600d5de7de1760bd6bbc723e07e9fcc0a559a9277703d39bd3373324cf462c"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89134e649fe13451f37ce1022aaa0ebbd6958fb5e9f80c71b129e8c78f73cfc3"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9e9cd76fa0936c811a267e9afb5c926e7981795516fd6ad93f88b4033fb52c00"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b6f79f411fc8a6e1e371c037b06d7ab0c27f373a9dca2c9832e1a73997ccc203"} Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697479 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697540 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:52 crc kubenswrapper[4743]: I1125 16:02:52.697557 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:53 crc kubenswrapper[4743]: I1125 16:02:53.713475 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 16:02:53 crc kubenswrapper[4743]: I1125 16:02:53.714434 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5bcce7bc42f8e9df32a7e4a94cbeca74b9ff6b5c5d1b1fc5cafd3808f05d3065"} Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.724554 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.791027 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.791335 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.796803 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.962966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 16:02:55 crc kubenswrapper[4743]: I1125 16:02:55.967408 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.284911 4743 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.343062 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09a97059-9125-4313-8cad-7748efe39b0f" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.737686 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.737722 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.741287 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09a97059-9125-4313-8cad-7748efe39b0f" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.742624 4743 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://b6f79f411fc8a6e1e371c037b06d7ab0c27f373a9dca2c9832e1a73997ccc203" Nov 25 16:02:58 crc kubenswrapper[4743]: I1125 16:02:58.742654 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:02:59 crc kubenswrapper[4743]: I1125 16:02:59.742341 4743 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:59 crc kubenswrapper[4743]: I1125 16:02:59.742400 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c7ccb4d3-6e49-456e-82c1-923ce6c11d69" Nov 25 16:02:59 crc kubenswrapper[4743]: I1125 16:02:59.745719 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09a97059-9125-4313-8cad-7748efe39b0f" Nov 25 16:03:01 crc kubenswrapper[4743]: I1125 16:03:01.678260 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" podUID="458c2ebd-ea67-4efc-b058-142de4fce612" containerName="registry" containerID="cri-o://3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498" gracePeriod=30 Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.007433 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p4fr\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084793 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084833 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.084884 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca\") pod \"458c2ebd-ea67-4efc-b058-142de4fce612\" (UID: \"458c2ebd-ea67-4efc-b058-142de4fce612\") " Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.085635 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.085696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.085871 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.085892 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/458c2ebd-ea67-4efc-b058-142de4fce612-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.090325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr" (OuterVolumeSpecName: "kube-api-access-9p4fr") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "kube-api-access-9p4fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.090387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.090459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.092768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.093459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.102106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "458c2ebd-ea67-4efc-b058-142de4fce612" (UID: "458c2ebd-ea67-4efc-b058-142de4fce612"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.186718 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/458c2ebd-ea67-4efc-b058-142de4fce612-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.186767 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.186779 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.186789 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p4fr\" (UniqueName: \"kubernetes.io/projected/458c2ebd-ea67-4efc-b058-142de4fce612-kube-api-access-9p4fr\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.186800 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/458c2ebd-ea67-4efc-b058-142de4fce612-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.758639 4743 generic.go:334] "Generic (PLEG): container finished" podID="458c2ebd-ea67-4efc-b058-142de4fce612" containerID="3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498" exitCode=0 Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.758686 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.758683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" event={"ID":"458c2ebd-ea67-4efc-b058-142de4fce612","Type":"ContainerDied","Data":"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498"} Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.758818 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-td7r9" event={"ID":"458c2ebd-ea67-4efc-b058-142de4fce612","Type":"ContainerDied","Data":"54e17b5f5395fc265fb5c93d9dbc767801234a3f5e596fbdff72a0f75963d82c"} Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.758836 4743 scope.go:117] "RemoveContainer" containerID="3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.775449 4743 scope.go:117] "RemoveContainer" containerID="3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498" Nov 25 16:03:02 crc kubenswrapper[4743]: E1125 16:03:02.775866 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498\": container with ID starting with 3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498 not found: ID does not exist" containerID="3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498" Nov 25 16:03:02 crc kubenswrapper[4743]: I1125 16:03:02.775922 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498"} err="failed to get container status \"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498\": rpc error: code = NotFound desc = could not find container 
\"3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498\": container with ID starting with 3be63306ab05ec1ef8da1e67f3266e85e16630818ac5aab67445b1f638db1498 not found: ID does not exist" Nov 25 16:03:05 crc kubenswrapper[4743]: I1125 16:03:05.728170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 16:03:07 crc kubenswrapper[4743]: I1125 16:03:07.556038 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 16:03:09 crc kubenswrapper[4743]: I1125 16:03:09.270405 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 16:03:09 crc kubenswrapper[4743]: I1125 16:03:09.551427 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 16:03:09 crc kubenswrapper[4743]: I1125 16:03:09.741485 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 16:03:10 crc kubenswrapper[4743]: I1125 16:03:10.007696 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 16:03:10 crc kubenswrapper[4743]: I1125 16:03:10.036807 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 16:03:10 crc kubenswrapper[4743]: I1125 16:03:10.081843 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 16:03:10 crc kubenswrapper[4743]: I1125 16:03:10.093670 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 16:03:10 crc kubenswrapper[4743]: I1125 16:03:10.159260 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.032752 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.148798 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.241897 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.271626 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.572057 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.604519 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.832179 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.890545 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.918650 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.954764 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.976934 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 16:03:11 crc kubenswrapper[4743]: I1125 16:03:11.977491 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.041185 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.184947 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.204121 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.204321 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.264449 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.360723 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.383321 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.394358 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 
16:03:12.411120 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.499312 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.505691 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.603434 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.623694 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.665822 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.670191 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.679704 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.725273 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.742585 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 16:03:12.894762 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 16:03:12 crc kubenswrapper[4743]: I1125 
16:03:12.999727 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.043145 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.194736 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.211170 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.211250 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.355406 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.495864 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.519892 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.584466 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.663471 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.746490 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.812823 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.925054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 16:03:13 crc kubenswrapper[4743]: I1125 16:03:13.955002 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.109525 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.163257 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.204463 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.280039 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.302049 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.312055 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.321396 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.369860 4743 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.381509 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.499025 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.519248 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.534960 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.571784 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.625208 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.633363 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.635951 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.637438 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.669351 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 16:03:14 
crc kubenswrapper[4743]: I1125 16:03:14.695433 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.801437 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.837372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.838894 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.841791 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.867833 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.978924 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 16:03:14 crc kubenswrapper[4743]: I1125 16:03:14.993820 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.021812 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.038425 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.114425 4743 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.210424 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.308705 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.494521 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.653253 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.683163 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.694353 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.728119 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.743665 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.860490 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 16:03:15 crc kubenswrapper[4743]: I1125 16:03:15.963504 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 16:03:15 crc 
kubenswrapper[4743]: I1125 16:03:15.993478 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.018448 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.116198 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.231412 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.263915 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.350786 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.360749 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.406690 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.610966 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.614276 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.734407 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.845313 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.916064 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 16:03:16 crc kubenswrapper[4743]: I1125 16:03:16.925035 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.003199 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.016512 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.048721 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.052656 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.148837 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.199071 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.258026 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.288348 4743 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.319508 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.350948 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.356143 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.501049 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.524508 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.556359 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.591731 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.628565 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.640181 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.641783 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.656685 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.727750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.789385 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.811333 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 16:03:17 crc kubenswrapper[4743]: I1125 16:03:17.921239 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.150222 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.190977 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.265339 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.370322 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.480432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.604811 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.643251 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.647480 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.703577 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.704998 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.839934 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 16:03:18 crc kubenswrapper[4743]: I1125 16:03:18.879026 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.030928 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.067200 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.168340 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.217651 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 
16:03:19.286261 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.301443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.392282 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.432826 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.441784 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.456195 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.473258 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.520162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.536537 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.540064 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.567895 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.645129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.645703 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.692107 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.834561 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 16:03:19 crc kubenswrapper[4743]: I1125 16:03:19.965160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.097831 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.099229 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.137348 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.231735 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.473087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.504331 4743 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.510541 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.559538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.581602 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.589180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.624198 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.631845 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.645913 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.684672 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.750685 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.762669 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.973339 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 16:03:20 crc kubenswrapper[4743]: I1125 16:03:20.973837 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.067617 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.201926 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.284970 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.308627 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.321950 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.323393 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.371239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.406074 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.419614 4743 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.483324 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.488284 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.558861 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.574006 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.642955 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.803698 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.852879 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.923693 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.933741 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 16:03:21 crc kubenswrapper[4743]: I1125 16:03:21.953380 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.015342 4743 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.090035 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.198911 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.260711 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.300733 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.309342 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.383100 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.426913 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.440463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.476541 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.560781 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.698292 4743 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.778935 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.795426 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.832414 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.833527 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.833506495 podStartE2EDuration="45.833506495s" podCreationTimestamp="2025-11-25 16:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:58.338132163 +0000 UTC m=+257.459971712" watchObservedRunningTime="2025-11-25 16:03:22.833506495 +0000 UTC m=+281.955346044" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.834052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vz76" podStartSLOduration=45.952100116 podStartE2EDuration="51.834044992s" podCreationTimestamp="2025-11-25 16:02:31 +0000 UTC" firstStartedPulling="2025-11-25 16:02:33.518672394 +0000 UTC m=+232.640511943" lastFinishedPulling="2025-11-25 16:02:39.40061727 +0000 UTC m=+238.522456819" observedRunningTime="2025-11-25 16:02:58.412931288 +0000 UTC m=+257.534770857" watchObservedRunningTime="2025-11-25 16:03:22.834044992 +0000 UTC m=+281.955884541" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.835287 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-hsk2l" podStartSLOduration=47.27816319 podStartE2EDuration="54.835282898s" podCreationTimestamp="2025-11-25 16:02:28 +0000 UTC" firstStartedPulling="2025-11-25 16:02:31.506001684 +0000 UTC m=+230.627841233" lastFinishedPulling="2025-11-25 16:02:39.063121392 +0000 UTC m=+238.184960941" observedRunningTime="2025-11-25 16:02:58.382544592 +0000 UTC m=+257.504384141" watchObservedRunningTime="2025-11-25 16:03:22.835282898 +0000 UTC m=+281.957122447" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.835783 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gmt44" podStartSLOduration=44.7815442 podStartE2EDuration="51.835775313s" podCreationTimestamp="2025-11-25 16:02:31 +0000 UTC" firstStartedPulling="2025-11-25 16:02:33.520516422 +0000 UTC m=+232.642355981" lastFinishedPulling="2025-11-25 16:02:40.574747545 +0000 UTC m=+239.696587094" observedRunningTime="2025-11-25 16:02:58.32409125 +0000 UTC m=+257.445930829" watchObservedRunningTime="2025-11-25 16:03:22.835775313 +0000 UTC m=+281.957614862" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.836665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kbcvk" podStartSLOduration=45.065033942 podStartE2EDuration="53.83665739s" podCreationTimestamp="2025-11-25 16:02:29 +0000 UTC" firstStartedPulling="2025-11-25 16:02:31.50624723 +0000 UTC m=+230.628086779" lastFinishedPulling="2025-11-25 16:02:40.277870678 +0000 UTC m=+239.399710227" observedRunningTime="2025-11-25 16:02:58.307207823 +0000 UTC m=+257.429047372" watchObservedRunningTime="2025-11-25 16:03:22.83665739 +0000 UTC m=+281.958496969" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.837104 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-td7r9","openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-w8pxz"] Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.837156 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.840794 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.862015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.861979735 podStartE2EDuration="24.861979735s" podCreationTimestamp="2025-11-25 16:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:03:22.852933422 +0000 UTC m=+281.974772991" watchObservedRunningTime="2025-11-25 16:03:22.861979735 +0000 UTC m=+281.983819324" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.943616 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 16:03:22 crc kubenswrapper[4743]: I1125 16:03:22.979299 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.076750 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.154793 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.348054 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.398742 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.408342 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.414295 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.450950 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.499774 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.554342 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.650333 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.782118 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458c2ebd-ea67-4efc-b058-142de4fce612" path="/var/lib/kubelet/pods/458c2ebd-ea67-4efc-b058-142de4fce612/volumes" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.782994 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" path="/var/lib/kubelet/pods/b7e20dd3-f239-419d-bc24-5e38d66e7803/volumes" Nov 25 16:03:23 crc kubenswrapper[4743]: I1125 16:03:23.953530 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.048708 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.200845 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.224222 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.291544 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.359436 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.451649 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 16:03:24 crc kubenswrapper[4743]: I1125 16:03:24.838469 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 16:03:25 crc kubenswrapper[4743]: I1125 16:03:25.247298 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 16:03:25 crc kubenswrapper[4743]: I1125 16:03:25.521517 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 16:03:26 crc kubenswrapper[4743]: I1125 16:03:26.529249 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 
16:03:30.740720 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7949b695f8-xhvwr"] Nov 25 16:03:30 crc kubenswrapper[4743]: E1125 16:03:30.741234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458c2ebd-ea67-4efc-b058-142de4fce612" containerName="registry" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741248 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="458c2ebd-ea67-4efc-b058-142de4fce612" containerName="registry" Nov 25 16:03:30 crc kubenswrapper[4743]: E1125 16:03:30.741263 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741269 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" Nov 25 16:03:30 crc kubenswrapper[4743]: E1125 16:03:30.741287 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d489a4-a435-45ed-b549-aec8103bb098" containerName="installer" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741299 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d489a4-a435-45ed-b549-aec8103bb098" containerName="installer" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741394 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e20dd3-f239-419d-bc24-5e38d66e7803" containerName="oauth-openshift" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741403 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d489a4-a435-45ed-b549-aec8103bb098" containerName="installer" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741414 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="458c2ebd-ea67-4efc-b058-142de4fce612" containerName="registry" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.741817 4743 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.744133 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.744167 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.744506 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.744721 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.745494 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.745939 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.746060 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.746644 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.746654 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.747052 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 16:03:30 crc 
kubenswrapper[4743]: I1125 16:03:30.747192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.747669 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.754369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7949b695f8-xhvwr"] Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.754525 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.758505 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.766110 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.810564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.810853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-session\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: 
\"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.810948 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc 
kubenswrapper[4743]: I1125 16:03:30.811397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-service-ca\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-audit-policies\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdc18045-fdf1-4305-b854-1da024920610-audit-dir\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.811923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-error\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.812027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8fd\" (UniqueName: \"kubernetes.io/projected/fdc18045-fdf1-4305-b854-1da024920610-kube-api-access-rq8fd\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.812121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-login\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.812226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-router-certs\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-router-certs\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " 
pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-session\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-service-ca\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-audit-policies\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: 
\"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdc18045-fdf1-4305-b854-1da024920610-audit-dir\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-error\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8fd\" (UniqueName: \"kubernetes.io/projected/fdc18045-fdf1-4305-b854-1da024920610-kube-api-access-rq8fd\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.913754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-login\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.914370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/fdc18045-fdf1-4305-b854-1da024920610-audit-dir\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.915094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.915302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.915395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-audit-policies\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.915471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-service-ca\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 
16:03:30.918787 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.919146 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.919455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-login\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.919506 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-template-error\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.919517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-router-certs\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.919996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.920157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.920406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fdc18045-fdf1-4305-b854-1da024920610-v4-0-config-system-session\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:30 crc kubenswrapper[4743]: I1125 16:03:30.932117 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8fd\" (UniqueName: \"kubernetes.io/projected/fdc18045-fdf1-4305-b854-1da024920610-kube-api-access-rq8fd\") pod \"oauth-openshift-7949b695f8-xhvwr\" (UID: \"fdc18045-fdf1-4305-b854-1da024920610\") " pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:31 crc kubenswrapper[4743]: I1125 16:03:31.057661 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:31 crc kubenswrapper[4743]: I1125 16:03:31.238692 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7949b695f8-xhvwr"]
Nov 25 16:03:31 crc kubenswrapper[4743]: I1125 16:03:31.898555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" event={"ID":"fdc18045-fdf1-4305-b854-1da024920610","Type":"ContainerStarted","Data":"1044aab151bc29d138308d8910a5fa233ed9aa57b206571c5ba8d55388b6a084"}
Nov 25 16:03:31 crc kubenswrapper[4743]: I1125 16:03:31.899051 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:31 crc kubenswrapper[4743]: I1125 16:03:31.899091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" event={"ID":"fdc18045-fdf1-4305-b854-1da024920610","Type":"ContainerStarted","Data":"4c87e034d163f7e839944e7ac177aebcf474279d5388c5623688e7bdbc492e68"}
Nov 25 16:03:32 crc kubenswrapper[4743]: I1125 16:03:32.015126 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr"
Nov 25 16:03:32 crc kubenswrapper[4743]: I1125 16:03:32.043674 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7949b695f8-xhvwr" podStartSLOduration=71.043636185 podStartE2EDuration="1m11.043636185s" podCreationTimestamp="2025-11-25 16:02:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:03:31.928027795 +0000 UTC m=+291.049867384" watchObservedRunningTime="2025-11-25 16:03:32.043636185 +0000 UTC m=+291.165475754"
Nov 25 16:03:32 crc kubenswrapper[4743]: I1125 16:03:32.291302
4743 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 25 16:03:32 crc kubenswrapper[4743]: I1125 16:03:32.291515 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9" gracePeriod=5
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.891022 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.891571 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906453 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906630 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.906709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.907125 4743 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.907166 4743 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.907183 4743 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.907195 4743 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.918272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.937929 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.937994 4743 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9" exitCode=137
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.938055 4743 scope.go:117] "RemoveContainer" containerID="265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.938145 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.958493 4743 scope.go:117] "RemoveContainer" containerID="265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9"
Nov 25 16:03:37 crc kubenswrapper[4743]: E1125 16:03:37.959241 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9\": container with ID starting with 265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9 not found: ID does not exist" containerID="265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9"
Nov 25 16:03:37 crc kubenswrapper[4743]: I1125 16:03:37.959283 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9"} err="failed to get container status \"265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9\": rpc error: code = NotFound desc = could not find container \"265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9\": container with ID starting with 265d8263ef0ce0d1dd80cc427a8fd32b69d275cc5786bb81b3e488af29b431a9 not found: ID does not exist"
Nov 25 16:03:38 crc kubenswrapper[4743]: I1125 16:03:38.008827 4743 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.786607 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.786911 4743 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.800135 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.800177 4743 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6aac2fc2-0fda-4cce-ac55-3dbc1fafa559"
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.807217 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 25 16:03:39 crc kubenswrapper[4743]: I1125 16:03:39.807249 4743 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="6aac2fc2-0fda-4cce-ac55-3dbc1fafa559"
Nov 25 16:03:46 crc kubenswrapper[4743]: I1125 16:03:46.918852 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"]
Nov 25 16:03:46 crc kubenswrapper[4743]: I1125 16:03:46.919704 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager" containerID="cri-o://7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66" gracePeriod=30
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.049361 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"]
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.049882 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" podUID="c5593b1c-91e9-48c0-b348-cd0a46f64639" containerName="route-controller-manager" containerID="cri-o://583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d" gracePeriod=30
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.325457 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.387417 4743 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.409433 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"]
Nov 25 16:03:47 crc kubenswrapper[4743]: E1125 16:03:47.409950 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.409976 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: E1125 16:03:47.410004 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.410014 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 16:03:47 crc kubenswrapper[4743]: E1125 16:03:47.410032 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5593b1c-91e9-48c0-b348-cd0a46f64639" containerName="route-controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.410041 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5593b1c-91e9-48c0-b348-cd0a46f64639" containerName="route-controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.410357 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.410386 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e800807-1cef-4dcb-9001-48322127beb9" containerName="controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.410403 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5593b1c-91e9-48c0-b348-cd0a46f64639" containerName="route-controller-manager"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.411112 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.433218 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"]
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436114 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config\") pod \"c5593b1c-91e9-48c0-b348-cd0a46f64639\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca\") pod \"c5593b1c-91e9-48c0-b348-cd0a46f64639\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436182 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert\") pod \"c5593b1c-91e9-48c0-b348-cd0a46f64639\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436218 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svdqx\" (UniqueName: \"kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx\") pod \"2e800807-1cef-4dcb-9001-48322127beb9\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436237 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config\") pod \"2e800807-1cef-4dcb-9001-48322127beb9\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert\") pod \"2e800807-1cef-4dcb-9001-48322127beb9\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles\") pod \"2e800807-1cef-4dcb-9001-48322127beb9\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl2rh\" (UniqueName: \"kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh\") pod \"c5593b1c-91e9-48c0-b348-cd0a46f64639\" (UID: \"c5593b1c-91e9-48c0-b348-cd0a46f64639\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca\") pod \"2e800807-1cef-4dcb-9001-48322127beb9\" (UID: \"2e800807-1cef-4dcb-9001-48322127beb9\") "
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert\") pod \"controller-manager-85dbdfd575-48mmr\"
(UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89krt\" (UniqueName: \"kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.436659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.438155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca" (OuterVolumeSpecName: "client-ca") pod "c5593b1c-91e9-48c0-b348-cd0a46f64639" (UID: "c5593b1c-91e9-48c0-b348-cd0a46f64639"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.438220 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e800807-1cef-4dcb-9001-48322127beb9" (UID: "2e800807-1cef-4dcb-9001-48322127beb9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.438700 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config" (OuterVolumeSpecName: "config") pod "2e800807-1cef-4dcb-9001-48322127beb9" (UID: "2e800807-1cef-4dcb-9001-48322127beb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.439904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2e800807-1cef-4dcb-9001-48322127beb9" (UID: "2e800807-1cef-4dcb-9001-48322127beb9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.445347 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh" (OuterVolumeSpecName: "kube-api-access-gl2rh") pod "c5593b1c-91e9-48c0-b348-cd0a46f64639" (UID: "c5593b1c-91e9-48c0-b348-cd0a46f64639"). InnerVolumeSpecName "kube-api-access-gl2rh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.446371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config" (OuterVolumeSpecName: "config") pod "c5593b1c-91e9-48c0-b348-cd0a46f64639" (UID: "c5593b1c-91e9-48c0-b348-cd0a46f64639"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.451194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx" (OuterVolumeSpecName: "kube-api-access-svdqx") pod "2e800807-1cef-4dcb-9001-48322127beb9" (UID: "2e800807-1cef-4dcb-9001-48322127beb9"). InnerVolumeSpecName "kube-api-access-svdqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.454063 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e800807-1cef-4dcb-9001-48322127beb9" (UID: "2e800807-1cef-4dcb-9001-48322127beb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.455815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5593b1c-91e9-48c0-b348-cd0a46f64639" (UID: "c5593b1c-91e9-48c0-b348-cd0a46f64639"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.472263 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"]
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.472966 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.483694 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"]
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.537958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89krt\" (UniqueName: \"kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kx7p\" (UniqueName: \"kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538158 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538234 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e800807-1cef-4dcb-9001-48322127beb9-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538247 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538259 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl2rh\" (UniqueName: \"kubernetes.io/projected/c5593b1c-91e9-48c0-b348-cd0a46f64639-kube-api-access-gl2rh\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538271 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538283 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-config\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125
16:03:47.538293 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5593b1c-91e9-48c0-b348-cd0a46f64639-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538303 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5593b1c-91e9-48c0-b348-cd0a46f64639-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538312 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svdqx\" (UniqueName: \"kubernetes.io/projected/2e800807-1cef-4dcb-9001-48322127beb9-kube-api-access-svdqx\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.538322 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e800807-1cef-4dcb-9001-48322127beb9-config\") on node \"crc\" DevicePath \"\""
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.539711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.539934 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.540326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.542320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.560953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89krt\" (UniqueName: \"kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt\") pod \"controller-manager-85dbdfd575-48mmr\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.639505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.639586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kx7p\" (UniqueName: \"kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.639638 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.639660 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.640917 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.641136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"
Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.642795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") "
pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.655068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kx7p\" (UniqueName: \"kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p\") pod \"route-controller-manager-75679b8575-pwg92\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.746199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.791486 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.941422 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"] Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.966698 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"] Nov 25 16:03:47 crc kubenswrapper[4743]: W1125 16:03:47.975164 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c730ab_8188_447b_aa09_957928e8fabb.slice/crio-6a0eaffb0375b505b4da82b67e49fd4656a510284a129d67bc6cc26b9e64cb7e WatchSource:0}: Error finding container 6a0eaffb0375b505b4da82b67e49fd4656a510284a129d67bc6cc26b9e64cb7e: Status 404 returned error can't find the container with id 6a0eaffb0375b505b4da82b67e49fd4656a510284a129d67bc6cc26b9e64cb7e Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.987275 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" event={"ID":"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4","Type":"ContainerStarted","Data":"9e6352650f5daad2c7444da51dad28263e5ad68a8a046229d98294f8fce162dd"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.988135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" event={"ID":"b5c730ab-8188-447b-aa09-957928e8fabb","Type":"ContainerStarted","Data":"6a0eaffb0375b505b4da82b67e49fd4656a510284a129d67bc6cc26b9e64cb7e"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.989654 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5593b1c-91e9-48c0-b348-cd0a46f64639" containerID="583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d" exitCode=0 Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.989726 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.989744 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" event={"ID":"c5593b1c-91e9-48c0-b348-cd0a46f64639","Type":"ContainerDied","Data":"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.989796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw" event={"ID":"c5593b1c-91e9-48c0-b348-cd0a46f64639","Type":"ContainerDied","Data":"3998ff0d5ccaadb4e3573a68f7642b085bffccabe932e1f016b791d01069cfb1"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.989816 4743 scope.go:117] "RemoveContainer" containerID="583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d" Nov 25 16:03:47 crc kubenswrapper[4743]: 
I1125 16:03:47.991888 4743 generic.go:334] "Generic (PLEG): container finished" podID="2e800807-1cef-4dcb-9001-48322127beb9" containerID="7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66" exitCode=0 Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.991935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" event={"ID":"2e800807-1cef-4dcb-9001-48322127beb9","Type":"ContainerDied","Data":"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.991961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" event={"ID":"2e800807-1cef-4dcb-9001-48322127beb9","Type":"ContainerDied","Data":"fd6e47d2ae17f8d7662fae77bb4c1321c84f256995e1c30e526d533eed3f09b5"} Nov 25 16:03:47 crc kubenswrapper[4743]: I1125 16:03:47.991959 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5n9pg" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.010846 4743 scope.go:117] "RemoveContainer" containerID="583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d" Nov 25 16:03:48 crc kubenswrapper[4743]: E1125 16:03:48.011238 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d\": container with ID starting with 583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d not found: ID does not exist" containerID="583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.011275 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d"} err="failed to get container status \"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d\": rpc error: code = NotFound desc = could not find container \"583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d\": container with ID starting with 583009a3548beff1624b9b1a3c4244fd7c5c08419eed47ee04c75dbc46d63a2d not found: ID does not exist" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.011323 4743 scope.go:117] "RemoveContainer" containerID="7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.013432 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"] Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.016935 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8xnw"] Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.020829 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"] Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.023374 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5n9pg"] Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.030793 4743 scope.go:117] "RemoveContainer" containerID="7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66" Nov 25 16:03:48 crc kubenswrapper[4743]: E1125 16:03:48.031243 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66\": container with ID starting with 7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66 not found: ID does not exist" containerID="7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.031276 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66"} err="failed to get container status \"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66\": rpc error: code = NotFound desc = could not find container \"7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66\": container with ID starting with 7f1eb3b73cac8b2c2a6bf1e4cbc4bd707b96ee2d88cd49d2b404034c73ef3e66 not found: ID does not exist" Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.997767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" event={"ID":"b5c730ab-8188-447b-aa09-957928e8fabb","Type":"ContainerStarted","Data":"9f5acea5a5fb45d6708cba8ddfc01007cbc4a54c2fc53334e0bcd66d1d52f1e9"} Nov 25 16:03:48 crc kubenswrapper[4743]: I1125 16:03:48.998284 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.001243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" event={"ID":"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4","Type":"ContainerStarted","Data":"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183"} Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.001869 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.002894 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.005327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.017631 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" podStartSLOduration=2.01756807 podStartE2EDuration="2.01756807s" podCreationTimestamp="2025-11-25 16:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:03:49.013452035 +0000 UTC m=+308.135291594" watchObservedRunningTime="2025-11-25 16:03:49.01756807 +0000 UTC m=+308.139407619" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.031380 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" podStartSLOduration=2.031359916 podStartE2EDuration="2.031359916s" podCreationTimestamp="2025-11-25 
16:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:03:49.028909622 +0000 UTC m=+308.150749181" watchObservedRunningTime="2025-11-25 16:03:49.031359916 +0000 UTC m=+308.153199475" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.781298 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e800807-1cef-4dcb-9001-48322127beb9" path="/var/lib/kubelet/pods/2e800807-1cef-4dcb-9001-48322127beb9/volumes" Nov 25 16:03:49 crc kubenswrapper[4743]: I1125 16:03:49.782114 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5593b1c-91e9-48c0-b348-cd0a46f64639" path="/var/lib/kubelet/pods/c5593b1c-91e9-48c0-b348-cd0a46f64639/volumes" Nov 25 16:03:50 crc kubenswrapper[4743]: I1125 16:03:50.367741 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"] Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.014740 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" podUID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" containerName="controller-manager" containerID="cri-o://5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183" gracePeriod=30 Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.417107 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.448781 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f9454b8d-n26bg"] Nov 25 16:03:52 crc kubenswrapper[4743]: E1125 16:03:52.449096 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" containerName="controller-manager" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.449118 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" containerName="controller-manager" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.449230 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" containerName="controller-manager" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.449722 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.452930 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9454b8d-n26bg"] Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.499811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles\") pod \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.499861 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89krt\" (UniqueName: \"kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt\") pod \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.499922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config\") pod \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.499945 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert\") pod \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\" (UID: \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.499981 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca\") pod \"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\" (UID: 
\"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4\") " Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-client-ca\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500135 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6sm8\" (UniqueName: \"kubernetes.io/projected/f881d491-0309-4082-a344-871bf86cf2cb-kube-api-access-q6sm8\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500169 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-config\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881d491-0309-4082-a344-871bf86cf2cb-serving-cert\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-proxy-ca-bundles\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config" (OuterVolumeSpecName: "config") pod "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" (UID: "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" (UID: "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.500646 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca" (OuterVolumeSpecName: "client-ca") pod "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" (UID: "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.504099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt" (OuterVolumeSpecName: "kube-api-access-89krt") pod "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" (UID: "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4"). InnerVolumeSpecName "kube-api-access-89krt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.504715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" (UID: "c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.769585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6sm8\" (UniqueName: \"kubernetes.io/projected/f881d491-0309-4082-a344-871bf86cf2cb-kube-api-access-q6sm8\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.769692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-config\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.769736 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881d491-0309-4082-a344-871bf86cf2cb-serving-cert\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.769754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-proxy-ca-bundles\") 
pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.769786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-client-ca\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-proxy-ca-bundles\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-config\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772705 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881d491-0309-4082-a344-871bf86cf2cb-client-ca\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772815 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772853 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89krt\" (UniqueName: \"kubernetes.io/projected/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-kube-api-access-89krt\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772874 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772890 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.772909 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.774277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881d491-0309-4082-a344-871bf86cf2cb-serving-cert\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:52 crc kubenswrapper[4743]: I1125 16:03:52.787096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6sm8\" (UniqueName: \"kubernetes.io/projected/f881d491-0309-4082-a344-871bf86cf2cb-kube-api-access-q6sm8\") pod \"controller-manager-6f9454b8d-n26bg\" (UID: \"f881d491-0309-4082-a344-871bf86cf2cb\") " pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" 
Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.020452 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" containerID="5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183" exitCode=0 Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.020533 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.020530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" event={"ID":"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4","Type":"ContainerDied","Data":"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183"} Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.020716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dbdfd575-48mmr" event={"ID":"c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4","Type":"ContainerDied","Data":"9e6352650f5daad2c7444da51dad28263e5ad68a8a046229d98294f8fce162dd"} Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.020993 4743 scope.go:117] "RemoveContainer" containerID="5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183" Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.035340 4743 scope.go:117] "RemoveContainer" containerID="5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183" Nov 25 16:03:53 crc kubenswrapper[4743]: E1125 16:03:53.035682 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183\": container with ID starting with 5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183 not found: ID does not exist" containerID="5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183" Nov 25 16:03:53 crc 
kubenswrapper[4743]: I1125 16:03:53.035718 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183"} err="failed to get container status \"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183\": rpc error: code = NotFound desc = could not find container \"5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183\": container with ID starting with 5741fbe9b76bde5385f886fab0cb171ecb2a14be90097520a97ff75fdbc21183 not found: ID does not exist" Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.049285 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"] Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.052673 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-85dbdfd575-48mmr"] Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.068301 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.227585 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f9454b8d-n26bg"] Nov 25 16:03:53 crc kubenswrapper[4743]: I1125 16:03:53.782305 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4" path="/var/lib/kubelet/pods/c5e1846d-fe4b-40e6-9ecf-84b6b9b12ef4/volumes" Nov 25 16:03:54 crc kubenswrapper[4743]: I1125 16:03:54.027054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" event={"ID":"f881d491-0309-4082-a344-871bf86cf2cb","Type":"ContainerStarted","Data":"077ad65269ced5491dc2a720bccaa087559016ae0f3f056a87c1c5a1d2da3bea"} Nov 25 16:03:54 crc kubenswrapper[4743]: I1125 16:03:54.027719 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:54 crc kubenswrapper[4743]: I1125 16:03:54.027753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" event={"ID":"f881d491-0309-4082-a344-871bf86cf2cb","Type":"ContainerStarted","Data":"3437004e30ba1c3c0c6623112f17fdaab76bf60e45e5eaf192c37351dd891d4f"} Nov 25 16:03:54 crc kubenswrapper[4743]: I1125 16:03:54.032619 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" Nov 25 16:03:54 crc kubenswrapper[4743]: I1125 16:03:54.044060 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f9454b8d-n26bg" podStartSLOduration=4.044034941 podStartE2EDuration="4.044034941s" podCreationTimestamp="2025-11-25 16:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:03:54.042349301 +0000 UTC m=+313.164188860" watchObservedRunningTime="2025-11-25 16:03:54.044034941 +0000 UTC m=+313.165874510" Nov 25 16:04:20 crc kubenswrapper[4743]: I1125 16:04:20.077081 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:04:20 crc kubenswrapper[4743]: I1125 16:04:20.077572 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:04:26 crc kubenswrapper[4743]: I1125 16:04:26.923763 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"] Nov 25 16:04:26 crc kubenswrapper[4743]: I1125 16:04:26.924482 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" podUID="b5c730ab-8188-447b-aa09-957928e8fabb" containerName="route-controller-manager" containerID="cri-o://9f5acea5a5fb45d6708cba8ddfc01007cbc4a54c2fc53334e0bcd66d1d52f1e9" gracePeriod=30 Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.204377 4743 generic.go:334] "Generic (PLEG): container finished" podID="b5c730ab-8188-447b-aa09-957928e8fabb" containerID="9f5acea5a5fb45d6708cba8ddfc01007cbc4a54c2fc53334e0bcd66d1d52f1e9" exitCode=0 Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.204418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" event={"ID":"b5c730ab-8188-447b-aa09-957928e8fabb","Type":"ContainerDied","Data":"9f5acea5a5fb45d6708cba8ddfc01007cbc4a54c2fc53334e0bcd66d1d52f1e9"} Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.350755 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.507561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca\") pod \"b5c730ab-8188-447b-aa09-957928e8fabb\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.507661 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config\") pod \"b5c730ab-8188-447b-aa09-957928e8fabb\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.507698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert\") pod \"b5c730ab-8188-447b-aa09-957928e8fabb\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.507739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kx7p\" (UniqueName: \"kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p\") pod \"b5c730ab-8188-447b-aa09-957928e8fabb\" (UID: \"b5c730ab-8188-447b-aa09-957928e8fabb\") " Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.508485 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5c730ab-8188-447b-aa09-957928e8fabb" (UID: "b5c730ab-8188-447b-aa09-957928e8fabb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.508708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config" (OuterVolumeSpecName: "config") pod "b5c730ab-8188-447b-aa09-957928e8fabb" (UID: "b5c730ab-8188-447b-aa09-957928e8fabb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.513457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5c730ab-8188-447b-aa09-957928e8fabb" (UID: "b5c730ab-8188-447b-aa09-957928e8fabb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.513498 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p" (OuterVolumeSpecName: "kube-api-access-5kx7p") pod "b5c730ab-8188-447b-aa09-957928e8fabb" (UID: "b5c730ab-8188-447b-aa09-957928e8fabb"). InnerVolumeSpecName "kube-api-access-5kx7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.609362 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.609407 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c730ab-8188-447b-aa09-957928e8fabb-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.609419 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5c730ab-8188-447b-aa09-957928e8fabb-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:04:27 crc kubenswrapper[4743]: I1125 16:04:27.609449 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kx7p\" (UniqueName: \"kubernetes.io/projected/b5c730ab-8188-447b-aa09-957928e8fabb-kube-api-access-5kx7p\") on node \"crc\" DevicePath \"\"" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.212778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" event={"ID":"b5c730ab-8188-447b-aa09-957928e8fabb","Type":"ContainerDied","Data":"6a0eaffb0375b505b4da82b67e49fd4656a510284a129d67bc6cc26b9e64cb7e"} Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.212837 4743 scope.go:117] "RemoveContainer" containerID="9f5acea5a5fb45d6708cba8ddfc01007cbc4a54c2fc53334e0bcd66d1d52f1e9" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.212986 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.235220 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"] Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.240775 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75679b8575-pwg92"] Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.778641 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz"] Nov 25 16:04:28 crc kubenswrapper[4743]: E1125 16:04:28.779117 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c730ab-8188-447b-aa09-957928e8fabb" containerName="route-controller-manager" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.779134 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c730ab-8188-447b-aa09-957928e8fabb" containerName="route-controller-manager" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.779238 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c730ab-8188-447b-aa09-957928e8fabb" containerName="route-controller-manager" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.779653 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.781140 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.781527 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.782872 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.783045 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.783069 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.783337 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.788869 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz"] Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.823895 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-client-ca\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.823955 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xfcv\" (UniqueName: \"kubernetes.io/projected/0502abed-a0ad-4eb8-883f-fcad2df75c22-kube-api-access-8xfcv\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.824046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0502abed-a0ad-4eb8-883f-fcad2df75c22-serving-cert\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.824093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-config\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.924829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-client-ca\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.924885 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xfcv\" (UniqueName: \"kubernetes.io/projected/0502abed-a0ad-4eb8-883f-fcad2df75c22-kube-api-access-8xfcv\") pod 
\"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.924937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0502abed-a0ad-4eb8-883f-fcad2df75c22-serving-cert\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.924970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-config\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.926487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-client-ca\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.926786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0502abed-a0ad-4eb8-883f-fcad2df75c22-config\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.929959 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0502abed-a0ad-4eb8-883f-fcad2df75c22-serving-cert\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:28 crc kubenswrapper[4743]: I1125 16:04:28.940199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xfcv\" (UniqueName: \"kubernetes.io/projected/0502abed-a0ad-4eb8-883f-fcad2df75c22-kube-api-access-8xfcv\") pod \"route-controller-manager-5c5cddd6fb-5gszz\" (UID: \"0502abed-a0ad-4eb8-883f-fcad2df75c22\") " pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:29 crc kubenswrapper[4743]: I1125 16:04:29.096139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:29 crc kubenswrapper[4743]: I1125 16:04:29.463503 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz"] Nov 25 16:04:29 crc kubenswrapper[4743]: I1125 16:04:29.781764 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c730ab-8188-447b-aa09-957928e8fabb" path="/var/lib/kubelet/pods/b5c730ab-8188-447b-aa09-957928e8fabb/volumes" Nov 25 16:04:30 crc kubenswrapper[4743]: I1125 16:04:30.223872 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" event={"ID":"0502abed-a0ad-4eb8-883f-fcad2df75c22","Type":"ContainerStarted","Data":"4b8002dacb16168f13972c2456f3adbeafb7ba59c1a17f48ce435a1c8634efc7"} Nov 25 16:04:30 crc kubenswrapper[4743]: I1125 16:04:30.224189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" 
event={"ID":"0502abed-a0ad-4eb8-883f-fcad2df75c22","Type":"ContainerStarted","Data":"8778f0b293dcdcde72dc9f229b22013192949a92422f7fb0ce1f837d55f445bc"} Nov 25 16:04:30 crc kubenswrapper[4743]: I1125 16:04:30.226027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:30 crc kubenswrapper[4743]: I1125 16:04:30.230453 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" Nov 25 16:04:30 crc kubenswrapper[4743]: I1125 16:04:30.241704 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c5cddd6fb-5gszz" podStartSLOduration=4.241685418 podStartE2EDuration="4.241685418s" podCreationTimestamp="2025-11-25 16:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:04:30.24067006 +0000 UTC m=+349.362509609" watchObservedRunningTime="2025-11-25 16:04:30.241685418 +0000 UTC m=+349.363524967" Nov 25 16:04:50 crc kubenswrapper[4743]: I1125 16:04:50.077908 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:04:50 crc kubenswrapper[4743]: I1125 16:04:50.078509 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 
16:05:20.077324 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.077799 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.077837 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.078317 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.078371 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4" gracePeriod=600 Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.460496 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4" exitCode=0 Nov 25 
16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.460555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4"} Nov 25 16:05:20 crc kubenswrapper[4743]: I1125 16:05:20.460857 4743 scope.go:117] "RemoveContainer" containerID="4a5aa1960d3775b276393241748cad12ac1af6a208722a3acd152d3b2912063c" Nov 25 16:05:21 crc kubenswrapper[4743]: I1125 16:05:21.467061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570"} Nov 25 16:07:20 crc kubenswrapper[4743]: I1125 16:07:20.077857 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:07:20 crc kubenswrapper[4743]: I1125 16:07:20.078448 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:07:50 crc kubenswrapper[4743]: I1125 16:07:50.077520 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:07:50 crc kubenswrapper[4743]: I1125 
16:07:50.078232 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.077538 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.078698 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.078796 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.079850 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.079945 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" 
containerName="machine-config-daemon" containerID="cri-o://40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570" gracePeriod=600 Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.797599 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570" exitCode=0 Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.797689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570"} Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.798578 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025"} Nov 25 16:08:20 crc kubenswrapper[4743]: I1125 16:08:20.798657 4743 scope.go:117] "RemoveContainer" containerID="15aa768991dd9857b9f4f50b7025dcc102ea45bed548518b0bfc1c7f52a875e4" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.818583 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s8482"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.821277 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.823271 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-krhrz" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.823503 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.823975 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.828379 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn9d5"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.829392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cn9d5" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.831873 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s8482"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.832717 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qb4h9" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.842318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn9d5"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.858602 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-clc4q"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.859399 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.861766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sgs5n" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.875855 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-clc4q"] Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.927858 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4js\" (UniqueName: \"kubernetes.io/projected/779c1d2b-063b-413b-80c1-63c1b5438aff-kube-api-access-gh4js\") pod \"cert-manager-5b446d88c5-cn9d5\" (UID: \"779c1d2b-063b-413b-80c1-63c1b5438aff\") " pod="cert-manager/cert-manager-5b446d88c5-cn9d5" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.927944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fwd\" (UniqueName: \"kubernetes.io/projected/6afbc225-6b21-4fe7-80a6-9fe85ffcac89-kube-api-access-49fwd\") pod \"cert-manager-cainjector-7f985d654d-s8482\" (UID: \"6afbc225-6b21-4fe7-80a6-9fe85ffcac89\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" Nov 25 16:08:52 crc kubenswrapper[4743]: I1125 16:08:52.928003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wf44\" (UniqueName: \"kubernetes.io/projected/9e38fcf4-5a14-44c4-b8ad-970d07e82284-kube-api-access-2wf44\") pod \"cert-manager-webhook-5655c58dd6-clc4q\" (UID: \"9e38fcf4-5a14-44c4-b8ad-970d07e82284\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.029222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wf44\" (UniqueName: 
\"kubernetes.io/projected/9e38fcf4-5a14-44c4-b8ad-970d07e82284-kube-api-access-2wf44\") pod \"cert-manager-webhook-5655c58dd6-clc4q\" (UID: \"9e38fcf4-5a14-44c4-b8ad-970d07e82284\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.029310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4js\" (UniqueName: \"kubernetes.io/projected/779c1d2b-063b-413b-80c1-63c1b5438aff-kube-api-access-gh4js\") pod \"cert-manager-5b446d88c5-cn9d5\" (UID: \"779c1d2b-063b-413b-80c1-63c1b5438aff\") " pod="cert-manager/cert-manager-5b446d88c5-cn9d5" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.029355 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fwd\" (UniqueName: \"kubernetes.io/projected/6afbc225-6b21-4fe7-80a6-9fe85ffcac89-kube-api-access-49fwd\") pod \"cert-manager-cainjector-7f985d654d-s8482\" (UID: \"6afbc225-6b21-4fe7-80a6-9fe85ffcac89\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.047926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wf44\" (UniqueName: \"kubernetes.io/projected/9e38fcf4-5a14-44c4-b8ad-970d07e82284-kube-api-access-2wf44\") pod \"cert-manager-webhook-5655c58dd6-clc4q\" (UID: \"9e38fcf4-5a14-44c4-b8ad-970d07e82284\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.047924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fwd\" (UniqueName: \"kubernetes.io/projected/6afbc225-6b21-4fe7-80a6-9fe85ffcac89-kube-api-access-49fwd\") pod \"cert-manager-cainjector-7f985d654d-s8482\" (UID: \"6afbc225-6b21-4fe7-80a6-9fe85ffcac89\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.049160 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4js\" (UniqueName: \"kubernetes.io/projected/779c1d2b-063b-413b-80c1-63c1b5438aff-kube-api-access-gh4js\") pod \"cert-manager-5b446d88c5-cn9d5\" (UID: \"779c1d2b-063b-413b-80c1-63c1b5438aff\") " pod="cert-manager/cert-manager-5b446d88c5-cn9d5" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.139332 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.148497 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cn9d5" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.174584 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.543855 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-s8482"] Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.551212 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.576770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cn9d5"] Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.580228 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-clc4q"] Nov 25 16:08:53 crc kubenswrapper[4743]: W1125 16:08:53.581271 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod779c1d2b_063b_413b_80c1_63c1b5438aff.slice/crio-8db9e1fea96fa99a7d9a66637d650947cc9716cd4ddec2340ac21316bf5ce559 WatchSource:0}: Error finding container 
8db9e1fea96fa99a7d9a66637d650947cc9716cd4ddec2340ac21316bf5ce559: Status 404 returned error can't find the container with id 8db9e1fea96fa99a7d9a66637d650947cc9716cd4ddec2340ac21316bf5ce559 Nov 25 16:08:53 crc kubenswrapper[4743]: W1125 16:08:53.585430 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e38fcf4_5a14_44c4_b8ad_970d07e82284.slice/crio-4184728035b3400afef45a8a320ee4fa9c97222eea2d1b20d3fbef95c19aa431 WatchSource:0}: Error finding container 4184728035b3400afef45a8a320ee4fa9c97222eea2d1b20d3fbef95c19aa431: Status 404 returned error can't find the container with id 4184728035b3400afef45a8a320ee4fa9c97222eea2d1b20d3fbef95c19aa431 Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.969817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cn9d5" event={"ID":"779c1d2b-063b-413b-80c1-63c1b5438aff","Type":"ContainerStarted","Data":"8db9e1fea96fa99a7d9a66637d650947cc9716cd4ddec2340ac21316bf5ce559"} Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.970820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" event={"ID":"9e38fcf4-5a14-44c4-b8ad-970d07e82284","Type":"ContainerStarted","Data":"4184728035b3400afef45a8a320ee4fa9c97222eea2d1b20d3fbef95c19aa431"} Nov 25 16:08:53 crc kubenswrapper[4743]: I1125 16:08:53.971726 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" event={"ID":"6afbc225-6b21-4fe7-80a6-9fe85ffcac89","Type":"ContainerStarted","Data":"88d8b1155abe4085261ed72e4d397d327660b95f8269519f58f9e9209247190a"} Nov 25 16:08:56 crc kubenswrapper[4743]: I1125 16:08:56.988120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" 
event={"ID":"9e38fcf4-5a14-44c4-b8ad-970d07e82284","Type":"ContainerStarted","Data":"326a8ae47a85616d4cbecbfd41037f8701ff0b0a95c2d6ed27a27750bd2d5f40"} Nov 25 16:08:56 crc kubenswrapper[4743]: I1125 16:08:56.988951 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:08:56 crc kubenswrapper[4743]: I1125 16:08:56.989845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" event={"ID":"6afbc225-6b21-4fe7-80a6-9fe85ffcac89","Type":"ContainerStarted","Data":"dad4cc68555aaba5dfa4e52a860c1dc22ce3d238b03e7a475c0981f0b7dc9bf7"} Nov 25 16:08:56 crc kubenswrapper[4743]: I1125 16:08:56.991364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cn9d5" event={"ID":"779c1d2b-063b-413b-80c1-63c1b5438aff","Type":"ContainerStarted","Data":"4cdfd5d886b482e826f7f33e3cf775adada8600a53c846946a1abc16e78c1fdd"} Nov 25 16:08:57 crc kubenswrapper[4743]: I1125 16:08:57.014073 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" podStartSLOduration=2.312328328 podStartE2EDuration="5.014054319s" podCreationTimestamp="2025-11-25 16:08:52 +0000 UTC" firstStartedPulling="2025-11-25 16:08:53.587544801 +0000 UTC m=+612.709384350" lastFinishedPulling="2025-11-25 16:08:56.289270782 +0000 UTC m=+615.411110341" observedRunningTime="2025-11-25 16:08:57.00275987 +0000 UTC m=+616.124599429" watchObservedRunningTime="2025-11-25 16:08:57.014054319 +0000 UTC m=+616.135893868" Nov 25 16:08:57 crc kubenswrapper[4743]: I1125 16:08:57.016206 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-s8482" podStartSLOduration=2.277172744 podStartE2EDuration="5.016199955s" podCreationTimestamp="2025-11-25 16:08:52 +0000 UTC" firstStartedPulling="2025-11-25 16:08:53.550975123 +0000 UTC 
m=+612.672814672" lastFinishedPulling="2025-11-25 16:08:56.290002334 +0000 UTC m=+615.411841883" observedRunningTime="2025-11-25 16:08:57.013531543 +0000 UTC m=+616.135371092" watchObservedRunningTime="2025-11-25 16:08:57.016199955 +0000 UTC m=+616.138039504" Nov 25 16:08:57 crc kubenswrapper[4743]: I1125 16:08:57.026970 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-cn9d5" podStartSLOduration=2.245732293 podStartE2EDuration="5.026950926s" podCreationTimestamp="2025-11-25 16:08:52 +0000 UTC" firstStartedPulling="2025-11-25 16:08:53.584293271 +0000 UTC m=+612.706132810" lastFinishedPulling="2025-11-25 16:08:56.365511904 +0000 UTC m=+615.487351443" observedRunningTime="2025-11-25 16:08:57.024502211 +0000 UTC m=+616.146341760" watchObservedRunningTime="2025-11-25 16:08:57.026950926 +0000 UTC m=+616.148790475" Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.826167 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbbjc"] Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827102 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-controller" containerID="cri-o://a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827475 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-node" containerID="cri-o://0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827533 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" 
podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="nbdb" containerID="cri-o://42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827516 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827572 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-acl-logging" containerID="cri-o://42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827650 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="northd" containerID="cri-o://328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.827615 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="sbdb" containerID="cri-o://1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d" gracePeriod=30 Nov 25 16:09:02 crc kubenswrapper[4743]: I1125 16:09:02.868315 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" containerID="cri-o://5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9" 
gracePeriod=30 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.023903 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/2.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.024454 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/1.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.024498 4743 generic.go:334] "Generic (PLEG): container finished" podID="2175b34c-5202-4e94-af0e-2f879b98c0bc" containerID="3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900" exitCode=2 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.024560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerDied","Data":"3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.024631 4743 scope.go:117] "RemoveContainer" containerID="47d4adf248256da18201eea949e15ec4471560028e06db3be072d6325667fc19" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.025236 4743 scope.go:117] "RemoveContainer" containerID="3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.025565 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n2r2l_openshift-multus(2175b34c-5202-4e94-af0e-2f879b98c0bc)\"" pod="openshift-multus/multus-n2r2l" podUID="2175b34c-5202-4e94-af0e-2f879b98c0bc" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.027245 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovnkube-controller/3.log" 
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.030006 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-acl-logging/0.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.030653 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-controller/0.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031651 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9" exitCode=0 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031677 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f" exitCode=0 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031687 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669" exitCode=0 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031697 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8" exitCode=143 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031707 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d" exitCode=143 Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" 
event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.031794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"} Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.141620 4743 scope.go:117] "RemoveContainer" containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.176971 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3\": container with ID starting with 30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3 not found: ID does not exist" 
containerID="30c1dc24785b20b6b99c90aad4101b47b2def037689276667e6c364bfebbfea3" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.177495 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-clc4q" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.182203 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-acl-logging/0.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.182650 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-controller/0.log" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.182974 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252547 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w4v9c"] Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252868 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252885 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252898 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-acl-logging" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252904 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-acl-logging" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252913 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252920 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252926 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="nbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252933 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="nbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252944 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="sbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252950 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="sbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252958 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="northd" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252965 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="northd" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252974 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252980 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.252988 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.252994 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.253002 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-node" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253030 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-node" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.253044 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kubecfg-setup" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253050 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kubecfg-setup" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.253056 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253062 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253147 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="northd" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253156 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="nbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253166 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="sbdb" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253173 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253182 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-acl-logging" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253190 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253198 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253206 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-node" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253216 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovn-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253224 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253232 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253238 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.253328 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253336 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: E1125 16:09:03.253347 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.253353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerName="ovnkube-controller" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.254916 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255169 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255210 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 
crc kubenswrapper[4743]: I1125 16:09:03.255272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255294 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbbps\" (UniqueName: \"kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log" (OuterVolumeSpecName: "node-log") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255602 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255585 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255624 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket" (OuterVolumeSpecName: "log-socket") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255639 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: 
\"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255768 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255809 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd\") pod \"d04400c3-4f05-4be2-b759-a60cec0746ec\" (UID: \"d04400c3-4f05-4be2-b759-a60cec0746ec\") " Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255909 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash" (OuterVolumeSpecName: "host-slash") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255917 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255973 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.255982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256047 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256485 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256507 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256518 4743 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-slash\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256545 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256615 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256642 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256656 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256666 4743 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256676 4743 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-node-log\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256687 4743 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256699 4743 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256710 4743 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-log-socket\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256721 4743 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256731 4743 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256741 4743 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256751 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.256761 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d04400c3-4f05-4be2-b759-a60cec0746ec-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.260793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.260879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps" (OuterVolumeSpecName: "kube-api-access-lbbps") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "kube-api-access-lbbps". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.268177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d04400c3-4f05-4be2-b759-a60cec0746ec" (UID: "d04400c3-4f05-4be2-b759-a60cec0746ec"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-etc-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-var-lib-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-log-socket\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-kubelet\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357169 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-env-overrides\") pod \"ovnkube-node-w4v9c\" (UID: 
\"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/9bda52b9-5f03-4dee-8ae7-cc04f138c227-kube-api-access-9qm6h\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-script-lib\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-systemd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-slash\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-node-log\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-config\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-ovn\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-bin\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357423 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovn-node-metrics-cert\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-systemd-units\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-netns\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357674 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-netd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357710 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d04400c3-4f05-4be2-b759-a60cec0746ec-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357722 4743 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d04400c3-4f05-4be2-b759-a60cec0746ec-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.357731 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbbps\" (UniqueName: \"kubernetes.io/projected/d04400c3-4f05-4be2-b759-a60cec0746ec-kube-api-access-lbbps\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/9bda52b9-5f03-4dee-8ae7-cc04f138c227-kube-api-access-9qm6h\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-script-lib\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459678 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-systemd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-systemd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459770 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-slash\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-slash\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.459984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-node-log\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460014 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-config\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-ovn\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-bin\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-node-log\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovn-node-metrics-cert\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-bin\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-systemd-units\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-run-ovn\") pod \"ovnkube-node-w4v9c\" (UID: 
\"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-systemd-units\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-netns\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460360 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-netd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-script-lib\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-run-netns\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-etc-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-cni-netd\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-var-lib-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-log-socket\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-etc-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-var-lib-openvswitch\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-kubelet\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-log-socket\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-env-overrides\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460637 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-kubelet\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovnkube-config\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.460849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9bda52b9-5f03-4dee-8ae7-cc04f138c227-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.461078 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9bda52b9-5f03-4dee-8ae7-cc04f138c227-env-overrides\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.462698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9bda52b9-5f03-4dee-8ae7-cc04f138c227-ovn-node-metrics-cert\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.481790 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/9bda52b9-5f03-4dee-8ae7-cc04f138c227-kube-api-access-9qm6h\") pod \"ovnkube-node-w4v9c\" (UID: \"9bda52b9-5f03-4dee-8ae7-cc04f138c227\") " pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:03 crc kubenswrapper[4743]: I1125 16:09:03.572204 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.040269 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-acl-logging/0.log"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.040781 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbbjc_d04400c3-4f05-4be2-b759-a60cec0746ec/ovn-controller/0.log"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041147 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d" exitCode=0
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041170 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560" exitCode=0
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041181 4743 generic.go:334] "Generic (PLEG): container finished" podID="d04400c3-4f05-4be2-b759-a60cec0746ec" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45" exitCode=0
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041237 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbbjc" event={"ID":"d04400c3-4f05-4be2-b759-a60cec0746ec","Type":"ContainerDied","Data":"7705045cc01c8d9ca1344b80a19bf76e3c1931e5a42002654041baa614c7e38a"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.041393 4743 scope.go:117] "RemoveContainer" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.042635 4743 generic.go:334] "Generic (PLEG): container finished" podID="9bda52b9-5f03-4dee-8ae7-cc04f138c227" containerID="5fdae726620536b5b13d2b43044e39b0fca68cd98bba6437b803b0902ea6bc2c" exitCode=0
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.042701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerDied","Data":"5fdae726620536b5b13d2b43044e39b0fca68cd98bba6437b803b0902ea6bc2c"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.042730 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"d85219c19c82b841f79a737a75bcd19c272409426d6138b8c3498469ea1bf1f4"}
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.044159 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/2.log"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.062420 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbbjc"]
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.069423 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbbjc"]
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.073776 4743 scope.go:117] "RemoveContainer" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.105927 4743 scope.go:117] "RemoveContainer" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.123649 4743 scope.go:117] "RemoveContainer" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.147413 4743 scope.go:117] "RemoveContainer" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.161466 4743 scope.go:117] "RemoveContainer" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.178648 4743 scope.go:117] "RemoveContainer" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.192646 4743 scope.go:117] "RemoveContainer" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.216318 4743 scope.go:117] "RemoveContainer" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.247738 4743 scope.go:117] "RemoveContainer" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.248235 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": container with ID starting with 5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9 not found: ID does not exist" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.248286 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"} err="failed to get container status \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": rpc error: code = NotFound desc = could not find container \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": container with ID starting with 5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.248322 4743 scope.go:117] "RemoveContainer" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.249187 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": container with ID starting with 1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d not found: ID does not exist" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.249261 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"} err="failed to get container status \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": rpc error: code = NotFound desc = could not find container \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": container with ID starting with 1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.249326 4743 scope.go:117] "RemoveContainer" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.249732 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": container with ID starting with 42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560 not found: ID does not exist" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.249778 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"} err="failed to get container status \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": rpc error: code = NotFound desc = could not find container \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": container with ID starting with 42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.249812 4743 scope.go:117] "RemoveContainer" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.250061 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": container with ID starting with 328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45 not found: ID does not exist" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250090 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"} err="failed to get container status \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": rpc error: code = NotFound desc = could not find container \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": container with ID starting with 328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250110 4743 scope.go:117] "RemoveContainer" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.250352 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": container with ID starting with 0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f not found: ID does not exist" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250380 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"} err="failed to get container status \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": rpc error: code = NotFound desc = could not find container \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": container with ID starting with 0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250397 4743 scope.go:117] "RemoveContainer" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.250661 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": container with ID starting with 0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669 not found: ID does not exist" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250691 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"} err="failed to get container status \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": rpc error: code = NotFound desc = could not find container \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": container with ID starting with 0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250709 4743 scope.go:117] "RemoveContainer" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.250957 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": container with ID starting with 42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8 not found: ID does not exist" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.250986 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"} err="failed to get container status \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": rpc error: code = NotFound desc = could not find container \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": container with ID starting with 42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251002 4743 scope.go:117] "RemoveContainer" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.251187 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": container with ID starting with a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d not found: ID does not exist" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251214 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"} err="failed to get container status \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": rpc error: code = NotFound desc = could not find container \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": container with ID starting with a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251229 4743 scope.go:117] "RemoveContainer" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"
Nov 25 16:09:04 crc kubenswrapper[4743]: E1125 16:09:04.251448 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": container with ID starting with 40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314 not found: ID does not exist" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251479 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"} err="failed to get container status \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": rpc error: code = NotFound desc = could not find container \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": container with ID starting with 40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251498 4743 scope.go:117] "RemoveContainer" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251734 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"} err="failed to get container status \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": rpc error: code = NotFound desc = could not find container \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": container with ID starting with 5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251755 4743 scope.go:117] "RemoveContainer" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251971 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"} err="failed to get container status \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": rpc error: code = NotFound desc = could not find container \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": container with ID starting with 1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.251993 4743 scope.go:117] "RemoveContainer" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252185 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"} err="failed to get container status \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": rpc error: code = NotFound desc = could not find container \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": container with ID starting with 42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252206 4743 scope.go:117] "RemoveContainer" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252433 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"} err="failed to get container status \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": rpc error: code = NotFound desc = could not find container \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": container with ID starting with 328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252457 4743 scope.go:117] "RemoveContainer" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252800 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"} err="failed to get container status \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": rpc error: code = NotFound desc = could not find container \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": container with ID starting with 0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.252824 4743 scope.go:117] "RemoveContainer" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253023 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"} err="failed to get container status \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": rpc error: code = NotFound desc = could not find container \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": container with ID starting with 0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253050 4743 scope.go:117] "RemoveContainer" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253250 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"} err="failed to get container status \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": rpc error: code = NotFound desc = could not find container \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": container with ID starting with 42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253270 4743 scope.go:117] "RemoveContainer" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253448 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"} err="failed to get container status \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": rpc error: code = NotFound desc = could not find container \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": container with ID starting with a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253467 4743 scope.go:117] "RemoveContainer" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253657 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"} err="failed to get container status \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": rpc error: code = NotFound desc = could not find container \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": container with ID starting with 40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253678 4743 scope.go:117] "RemoveContainer" containerID="5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253902 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9"} err="failed to get container status \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": rpc error: code = NotFound desc = could not find container \"5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9\": container with ID starting with 5c791bfc7095b38b99de8c5d1ca74027848aa84be890e3cb636fd001a8eaa3f9 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.253931 4743 scope.go:117] "RemoveContainer" containerID="1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254139 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d"} err="failed to get container status \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": rpc error: code = NotFound desc = could not find container \"1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d\": container with ID starting with 1bba3887faa4ecae030bb3df20aaca9395dcb8b55a62906c7b51c417f093623d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254164 4743 scope.go:117] "RemoveContainer" containerID="42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254356 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560"} err="failed to get container status \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": rpc error: code = NotFound desc = could not find container \"42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560\": container with ID starting with 42cd125c015bdf8452fc2d0cd856d8b6c91d79c341c81b73895acb236ee93560 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254375 4743 scope.go:117] "RemoveContainer" containerID="328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254558 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45"} err="failed to get container status \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": rpc error: code = NotFound desc = could not find container \"328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45\": container with ID starting with 328651ae66201d81540d6e5697cc2b19d6c791b8089aa57f2d547ad061d40b45 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254578 4743 scope.go:117] "RemoveContainer" containerID="0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254795 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f"} err="failed to get container status \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": rpc error: code = NotFound desc = could not find container \"0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f\": container with ID starting with 0fd965bc3302e3b5f316b197efb55218f1dc57087fe65bc1a58d7e279c32103f not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.254832 4743 scope.go:117] "RemoveContainer" containerID="0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.255019 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669"} err="failed to get container status \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": rpc error: code = NotFound desc = could not find container \"0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669\": container with ID starting with 0c04dd84401b01e2e198b88c796c05d00f070d3d6c5ef7814c1c74e66f615669 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.255043 4743 scope.go:117] "RemoveContainer" containerID="42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.255255 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8"} err="failed to get container status \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": rpc error: code = NotFound desc = could not find container \"42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8\": container with ID starting with 42380380626b56f7d10622f1e99204f0082445dc96dae3942902e3262c649ab8 not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.255278 4743 scope.go:117] "RemoveContainer" containerID="a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.257814 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d"} err="failed to get container status \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": rpc error: code = NotFound desc = could not find container \"a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d\": container with ID starting with a3721b4dcbf05d1ce1afb51796b7cb31a1cfaee4dcc4eb11f3100a3a1d6d567d not found: ID does not exist"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.257863 4743 scope.go:117] "RemoveContainer" containerID="40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"
Nov 25 16:09:04 crc kubenswrapper[4743]: I1125 16:09:04.258418 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314"} err="failed to get container status \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": rpc error: code = NotFound desc = could not find container \"40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314\": container with ID starting with 40edd80407d3a46c03aaf0a8b7c1d852501473b784942d71014b02256fe6b314 not found: ID does not exist"
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"c87c927ba35189bf43ad778d6d5d8cba7596d085c11abfde6037a851d9ac4f8f"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"ef4dfbdd4145caf7c3daa197e9cf61994cba90b61dec4119f6e2ee5d46b2041c"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056807 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"15f20feb19a63c2560791dab5eb2875ca1caed48c4d0494c3f6242c538ddd3fa"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"0f6de15b5a769a70594a8c01f8490c56f091304f3d2969700e0bb1b22238d72d"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"f9f0b737bab6200566415c5cfd83b82326187e3e3f127873ca73ec1e19067ad5"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.056843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"580f4f08b33483c752c8acfe8b349b1a4c206cf701c88cc522a4b89e3190983e"}
Nov 25 16:09:05 crc kubenswrapper[4743]: I1125 16:09:05.784352 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04400c3-4f05-4be2-b759-a60cec0746ec"
path="/var/lib/kubelet/pods/d04400c3-4f05-4be2-b759-a60cec0746ec/volumes" Nov 25 16:09:07 crc kubenswrapper[4743]: I1125 16:09:07.072694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"8c82114800c46c21b637dc9f5bc85da1166500375bea5583d5d32117d275f68f"} Nov 25 16:09:10 crc kubenswrapper[4743]: I1125 16:09:10.090985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" event={"ID":"9bda52b9-5f03-4dee-8ae7-cc04f138c227","Type":"ContainerStarted","Data":"f78c460543da8c9a36b10e0a8dd353193982fe085be021219024c83e153b6282"} Nov 25 16:09:10 crc kubenswrapper[4743]: I1125 16:09:10.092333 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:10 crc kubenswrapper[4743]: I1125 16:09:10.092416 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:10 crc kubenswrapper[4743]: I1125 16:09:10.120845 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:10 crc kubenswrapper[4743]: I1125 16:09:10.123653 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" podStartSLOduration=7.123634266 podStartE2EDuration="7.123634266s" podCreationTimestamp="2025-11-25 16:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:09:10.12083534 +0000 UTC m=+629.242674909" watchObservedRunningTime="2025-11-25 16:09:10.123634266 +0000 UTC m=+629.245473825" Nov 25 16:09:11 crc kubenswrapper[4743]: I1125 16:09:11.095516 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:11 crc kubenswrapper[4743]: I1125 16:09:11.121322 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:14 crc kubenswrapper[4743]: I1125 16:09:14.775072 4743 scope.go:117] "RemoveContainer" containerID="3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900" Nov 25 16:09:14 crc kubenswrapper[4743]: E1125 16:09:14.775968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n2r2l_openshift-multus(2175b34c-5202-4e94-af0e-2f879b98c0bc)\"" pod="openshift-multus/multus-n2r2l" podUID="2175b34c-5202-4e94-af0e-2f879b98c0bc" Nov 25 16:09:29 crc kubenswrapper[4743]: I1125 16:09:29.775009 4743 scope.go:117] "RemoveContainer" containerID="3d369a5a2f039f3e389943653b6aee59a909ef60ee893769d64b9c6016f61900" Nov 25 16:09:30 crc kubenswrapper[4743]: I1125 16:09:30.208003 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n2r2l_2175b34c-5202-4e94-af0e-2f879b98c0bc/kube-multus/2.log" Nov 25 16:09:30 crc kubenswrapper[4743]: I1125 16:09:30.208223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n2r2l" event={"ID":"2175b34c-5202-4e94-af0e-2f879b98c0bc","Type":"ContainerStarted","Data":"d7f64d5adc38eeb830d49949897fc17c85e935730e5f7742def510c25c158fa1"} Nov 25 16:09:33 crc kubenswrapper[4743]: I1125 16:09:33.598397 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w4v9c" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.015980 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt"] Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.017397 4743 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.019901 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.026562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt"] Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.081186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2ss\" (UniqueName: \"kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.081256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.081314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 
16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.182103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2ss\" (UniqueName: \"kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.182157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.182206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.183063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.183097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.211779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2ss\" (UniqueName: \"kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.336047 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:45 crc kubenswrapper[4743]: I1125 16:09:45.544114 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt"] Nov 25 16:09:45 crc kubenswrapper[4743]: W1125 16:09:45.548477 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0d30a7_3fb8_4595_a303_9490f5a78667.slice/crio-20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037 WatchSource:0}: Error finding container 20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037: Status 404 returned error can't find the container with id 20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037 Nov 25 16:09:46 crc kubenswrapper[4743]: I1125 16:09:46.291250 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerID="5359c0d8156accd73c68b4d73a628b532c91c5e2dffec9b03f9c2427056cd467" 
exitCode=0 Nov 25 16:09:46 crc kubenswrapper[4743]: I1125 16:09:46.291296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" event={"ID":"8f0d30a7-3fb8-4595-a303-9490f5a78667","Type":"ContainerDied","Data":"5359c0d8156accd73c68b4d73a628b532c91c5e2dffec9b03f9c2427056cd467"} Nov 25 16:09:46 crc kubenswrapper[4743]: I1125 16:09:46.291369 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" event={"ID":"8f0d30a7-3fb8-4595-a303-9490f5a78667","Type":"ContainerStarted","Data":"20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037"} Nov 25 16:09:48 crc kubenswrapper[4743]: I1125 16:09:48.304893 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerID="50859c936b78f2e293499b04523f82814f0bacb693272bd3948ece33c83a4e88" exitCode=0 Nov 25 16:09:48 crc kubenswrapper[4743]: I1125 16:09:48.304952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" event={"ID":"8f0d30a7-3fb8-4595-a303-9490f5a78667","Type":"ContainerDied","Data":"50859c936b78f2e293499b04523f82814f0bacb693272bd3948ece33c83a4e88"} Nov 25 16:09:49 crc kubenswrapper[4743]: I1125 16:09:49.318272 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerID="e9a9b6d3ddf23362f482c6e3a616ea19fa3d875f90b112f297b56812ed25c254" exitCode=0 Nov 25 16:09:49 crc kubenswrapper[4743]: I1125 16:09:49.318336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" event={"ID":"8f0d30a7-3fb8-4595-a303-9490f5a78667","Type":"ContainerDied","Data":"e9a9b6d3ddf23362f482c6e3a616ea19fa3d875f90b112f297b56812ed25c254"} Nov 25 16:09:50 crc 
kubenswrapper[4743]: I1125 16:09:50.563186 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.652331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle\") pod \"8f0d30a7-3fb8-4595-a303-9490f5a78667\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.652411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb2ss\" (UniqueName: \"kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss\") pod \"8f0d30a7-3fb8-4595-a303-9490f5a78667\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.652500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util\") pod \"8f0d30a7-3fb8-4595-a303-9490f5a78667\" (UID: \"8f0d30a7-3fb8-4595-a303-9490f5a78667\") " Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.653305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle" (OuterVolumeSpecName: "bundle") pod "8f0d30a7-3fb8-4595-a303-9490f5a78667" (UID: "8f0d30a7-3fb8-4595-a303-9490f5a78667"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.659365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss" (OuterVolumeSpecName: "kube-api-access-cb2ss") pod "8f0d30a7-3fb8-4595-a303-9490f5a78667" (UID: "8f0d30a7-3fb8-4595-a303-9490f5a78667"). InnerVolumeSpecName "kube-api-access-cb2ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.666129 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util" (OuterVolumeSpecName: "util") pod "8f0d30a7-3fb8-4595-a303-9490f5a78667" (UID: "8f0d30a7-3fb8-4595-a303-9490f5a78667"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.754168 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-util\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.754209 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f0d30a7-3fb8-4595-a303-9490f5a78667-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:50 crc kubenswrapper[4743]: I1125 16:09:50.754222 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb2ss\" (UniqueName: \"kubernetes.io/projected/8f0d30a7-3fb8-4595-a303-9490f5a78667-kube-api-access-cb2ss\") on node \"crc\" DevicePath \"\"" Nov 25 16:09:51 crc kubenswrapper[4743]: I1125 16:09:51.333164 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" 
event={"ID":"8f0d30a7-3fb8-4595-a303-9490f5a78667","Type":"ContainerDied","Data":"20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037"} Nov 25 16:09:51 crc kubenswrapper[4743]: I1125 16:09:51.333209 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20785f4ae8f1953c379b29abd471bded2f90ea7981df3b2467a3289b30a88037" Nov 25 16:09:51 crc kubenswrapper[4743]: I1125 16:09:51.333273 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.761241 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6mtvn"] Nov 25 16:09:52 crc kubenswrapper[4743]: E1125 16:09:52.761778 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="util" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.761790 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="util" Nov 25 16:09:52 crc kubenswrapper[4743]: E1125 16:09:52.761803 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="pull" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.761808 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="pull" Nov 25 16:09:52 crc kubenswrapper[4743]: E1125 16:09:52.761817 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="extract" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.761824 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="extract" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.761921 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8f0d30a7-3fb8-4595-a303-9490f5a78667" containerName="extract" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.762310 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.765757 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.765848 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-twvzm" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.765769 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.771171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6mtvn"] Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.780816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7bz\" (UniqueName: \"kubernetes.io/projected/a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae-kube-api-access-6p7bz\") pod \"nmstate-operator-557fdffb88-6mtvn\" (UID: \"a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.881904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7bz\" (UniqueName: \"kubernetes.io/projected/a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae-kube-api-access-6p7bz\") pod \"nmstate-operator-557fdffb88-6mtvn\" (UID: \"a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" Nov 25 16:09:52 crc kubenswrapper[4743]: I1125 16:09:52.898365 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6p7bz\" (UniqueName: \"kubernetes.io/projected/a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae-kube-api-access-6p7bz\") pod \"nmstate-operator-557fdffb88-6mtvn\" (UID: \"a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" Nov 25 16:09:53 crc kubenswrapper[4743]: I1125 16:09:53.080320 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" Nov 25 16:09:53 crc kubenswrapper[4743]: I1125 16:09:53.263150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6mtvn"] Nov 25 16:09:53 crc kubenswrapper[4743]: W1125 16:09:53.269659 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c55fbc_7f41_4fe1_b7cf_9b5476c4c1ae.slice/crio-679fd5f0dc60eab1d3c9bdd1ed90281b8fd8b3d756160c1f17658b58c8d2f2e7 WatchSource:0}: Error finding container 679fd5f0dc60eab1d3c9bdd1ed90281b8fd8b3d756160c1f17658b58c8d2f2e7: Status 404 returned error can't find the container with id 679fd5f0dc60eab1d3c9bdd1ed90281b8fd8b3d756160c1f17658b58c8d2f2e7 Nov 25 16:09:53 crc kubenswrapper[4743]: I1125 16:09:53.345483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" event={"ID":"a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae","Type":"ContainerStarted","Data":"679fd5f0dc60eab1d3c9bdd1ed90281b8fd8b3d756160c1f17658b58c8d2f2e7"} Nov 25 16:09:56 crc kubenswrapper[4743]: I1125 16:09:56.361409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" event={"ID":"a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae","Type":"ContainerStarted","Data":"dac9b648f6e3cabb59c325c3d669918352c0afc43c22bf21e92ff4fd1675e9ed"} Nov 25 16:09:56 crc kubenswrapper[4743]: I1125 16:09:56.377574 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-6mtvn" podStartSLOduration=2.254859772 podStartE2EDuration="4.377555108s" podCreationTimestamp="2025-11-25 16:09:52 +0000 UTC" firstStartedPulling="2025-11-25 16:09:53.273964793 +0000 UTC m=+672.395804342" lastFinishedPulling="2025-11-25 16:09:55.396660129 +0000 UTC m=+674.518499678" observedRunningTime="2025-11-25 16:09:56.373840852 +0000 UTC m=+675.495680401" watchObservedRunningTime="2025-11-25 16:09:56.377555108 +0000 UTC m=+675.499394657" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.419746 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.420561 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.422261 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lvqkr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.429538 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.430415 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.432189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.436154 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm6wt\" (UniqueName: \"kubernetes.io/projected/62b25135-7567-4053-ab8a-5df129154693-kube-api-access-qm6wt\") pod \"nmstate-metrics-5dcf9c57c5-7q45f\" (UID: \"62b25135-7567-4053-ab8a-5df129154693\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.438054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.459298 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.464127 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6cznj"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.465135 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.537465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-ovs-socket\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.537742 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-dbus-socket\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.537931 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.538027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwff\" (UniqueName: \"kubernetes.io/projected/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-kube-api-access-mpwff\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.538081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pclfm\" (UniqueName: 
\"kubernetes.io/projected/25afedc8-ae76-4c86-aeaa-c739b1458040-kube-api-access-pclfm\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.538166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm6wt\" (UniqueName: \"kubernetes.io/projected/62b25135-7567-4053-ab8a-5df129154693-kube-api-access-qm6wt\") pod \"nmstate-metrics-5dcf9c57c5-7q45f\" (UID: \"62b25135-7567-4053-ab8a-5df129154693\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.538428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-nmstate-lock\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.547930 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.548646 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.550766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-g65rj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.551136 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.551378 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.564067 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.576667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm6wt\" (UniqueName: \"kubernetes.io/projected/62b25135-7567-4053-ab8a-5df129154693-kube-api-access-qm6wt\") pod \"nmstate-metrics-5dcf9c57c5-7q45f\" (UID: \"62b25135-7567-4053-ab8a-5df129154693\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.639911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pclfm\" (UniqueName: \"kubernetes.io/projected/25afedc8-ae76-4c86-aeaa-c739b1458040-kube-api-access-pclfm\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7hn9\" (UniqueName: \"kubernetes.io/projected/994ed247-8a08-4164-89b4-c03a90c4ef5d-kube-api-access-m7hn9\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " 
pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/994ed247-8a08-4164-89b4-c03a90c4ef5d-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-nmstate-lock\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/994ed247-8a08-4164-89b4-c03a90c4ef5d-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640110 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-ovs-socket\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-dbus-socket\") pod \"nmstate-handler-6cznj\" (UID: 
\"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640144 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwff\" (UniqueName: \"kubernetes.io/projected/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-kube-api-access-mpwff\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-nmstate-lock\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: E1125 16:09:57.640412 4743 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.640449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-ovs-socket\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: E1125 16:09:57.640481 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair podName:b423b0b1-b7c2-4a09-a332-cc9c03bfca51 nodeName:}" failed. No retries permitted until 2025-11-25 16:09:58.140461884 +0000 UTC m=+677.262301443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair") pod "nmstate-webhook-6b89b748d8-clgq2" (UID: "b423b0b1-b7c2-4a09-a332-cc9c03bfca51") : secret "openshift-nmstate-webhook" not found Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.641027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/25afedc8-ae76-4c86-aeaa-c739b1458040-dbus-socket\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.657204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pclfm\" (UniqueName: \"kubernetes.io/projected/25afedc8-ae76-4c86-aeaa-c739b1458040-kube-api-access-pclfm\") pod \"nmstate-handler-6cznj\" (UID: \"25afedc8-ae76-4c86-aeaa-c739b1458040\") " pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.658061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwff\" (UniqueName: \"kubernetes.io/projected/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-kube-api-access-mpwff\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.738280 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.741002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hn9\" (UniqueName: \"kubernetes.io/projected/994ed247-8a08-4164-89b4-c03a90c4ef5d-kube-api-access-m7hn9\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.741032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/994ed247-8a08-4164-89b4-c03a90c4ef5d-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.741064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/994ed247-8a08-4164-89b4-c03a90c4ef5d-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.741944 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/994ed247-8a08-4164-89b4-c03a90c4ef5d-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.745199 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/994ed247-8a08-4164-89b4-c03a90c4ef5d-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.769022 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-67f9bfdd55-mvfgr"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.770502 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.777510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hn9\" (UniqueName: \"kubernetes.io/projected/994ed247-8a08-4164-89b4-c03a90c4ef5d-kube-api-access-m7hn9\") pod \"nmstate-console-plugin-5874bd7bc5-zhqdb\" (UID: \"994ed247-8a08-4164-89b4-c03a90c4ef5d\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.785743 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67f9bfdd55-mvfgr"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.788159 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:09:57 crc kubenswrapper[4743]: W1125 16:09:57.816287 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25afedc8_ae76_4c86_aeaa_c739b1458040.slice/crio-4f269d1612e4d097f4e94b463e40ae93d24f1b41499724c8f6613480b13ff9b8 WatchSource:0}: Error finding container 4f269d1612e4d097f4e94b463e40ae93d24f1b41499724c8f6613480b13ff9b8: Status 404 returned error can't find the container with id 4f269d1612e4d097f4e94b463e40ae93d24f1b41499724c8f6613480b13ff9b8 Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.842656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wdrc\" (UniqueName: \"kubernetes.io/projected/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-kube-api-access-2wdrc\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.843013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-service-ca\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.843036 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-oauth-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.843083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.843146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-oauth-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.844558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-trusted-ca-bundle\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.844651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.864169 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.942241 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f"] Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-oauth-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-trusted-ca-bundle\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945819 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wdrc\" (UniqueName: \"kubernetes.io/projected/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-kube-api-access-2wdrc\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-service-ca\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-oauth-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.945896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.946842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-oauth-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.948004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.948215 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-service-ca\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.952023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-oauth-config\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.952035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-console-serving-cert\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.957176 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-trusted-ca-bundle\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:57 crc kubenswrapper[4743]: I1125 16:09:57.966217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wdrc\" (UniqueName: \"kubernetes.io/projected/b42b515a-e7b3-4d8b-ab01-ee8c9607fe05-kube-api-access-2wdrc\") pod \"console-67f9bfdd55-mvfgr\" (UID: \"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05\") " pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.028791 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb"] Nov 25 16:09:58 crc kubenswrapper[4743]: 
W1125 16:09:58.030984 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994ed247_8a08_4164_89b4_c03a90c4ef5d.slice/crio-ffbf29b50d4c8a254a0ce07e2574d44d03018c0458619b5715cbceb3ff8b29c7 WatchSource:0}: Error finding container ffbf29b50d4c8a254a0ce07e2574d44d03018c0458619b5715cbceb3ff8b29c7: Status 404 returned error can't find the container with id ffbf29b50d4c8a254a0ce07e2574d44d03018c0458619b5715cbceb3ff8b29c7 Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.109629 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.148655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.151806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b423b0b1-b7c2-4a09-a332-cc9c03bfca51-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-clgq2\" (UID: \"b423b0b1-b7c2-4a09-a332-cc9c03bfca51\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.313139 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67f9bfdd55-mvfgr"] Nov 25 16:09:58 crc kubenswrapper[4743]: W1125 16:09:58.318219 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb42b515a_e7b3_4d8b_ab01_ee8c9607fe05.slice/crio-4c99ecf39205aa769b7045aad3105129896831d02f3fa364d26da57a9a3bfc4c WatchSource:0}: Error finding container 
4c99ecf39205aa769b7045aad3105129896831d02f3fa364d26da57a9a3bfc4c: Status 404 returned error can't find the container with id 4c99ecf39205aa769b7045aad3105129896831d02f3fa364d26da57a9a3bfc4c Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.358867 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.377097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" event={"ID":"994ed247-8a08-4164-89b4-c03a90c4ef5d","Type":"ContainerStarted","Data":"ffbf29b50d4c8a254a0ce07e2574d44d03018c0458619b5715cbceb3ff8b29c7"} Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.379821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f9bfdd55-mvfgr" event={"ID":"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05","Type":"ContainerStarted","Data":"4c99ecf39205aa769b7045aad3105129896831d02f3fa364d26da57a9a3bfc4c"} Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.380944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" event={"ID":"62b25135-7567-4053-ab8a-5df129154693","Type":"ContainerStarted","Data":"8f8a703bdac38237c4cdda379a5fc9d138b5bd6707bf88ab523fc46d67d75638"} Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.381929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6cznj" event={"ID":"25afedc8-ae76-4c86-aeaa-c739b1458040","Type":"ContainerStarted","Data":"4f269d1612e4d097f4e94b463e40ae93d24f1b41499724c8f6613480b13ff9b8"} Nov 25 16:09:58 crc kubenswrapper[4743]: I1125 16:09:58.533813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2"] Nov 25 16:09:59 crc kubenswrapper[4743]: I1125 16:09:59.389863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" event={"ID":"b423b0b1-b7c2-4a09-a332-cc9c03bfca51","Type":"ContainerStarted","Data":"4576741ac48595e7065abc467ef59eea3e2949f0ab65db9dab302be2befaccde"} Nov 25 16:09:59 crc kubenswrapper[4743]: I1125 16:09:59.391281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67f9bfdd55-mvfgr" event={"ID":"b42b515a-e7b3-4d8b-ab01-ee8c9607fe05","Type":"ContainerStarted","Data":"59064b94580ee9487720d0c9a9325e2a8d8d158a6ca19351fa1256d399178706"} Nov 25 16:09:59 crc kubenswrapper[4743]: I1125 16:09:59.417971 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67f9bfdd55-mvfgr" podStartSLOduration=2.417953216 podStartE2EDuration="2.417953216s" podCreationTimestamp="2025-11-25 16:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:09:59.412396963 +0000 UTC m=+678.534236532" watchObservedRunningTime="2025-11-25 16:09:59.417953216 +0000 UTC m=+678.539792765" Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.407362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" event={"ID":"62b25135-7567-4053-ab8a-5df129154693","Type":"ContainerStarted","Data":"af394ece2eb03923b184e2f27ef2851b344904850702094245d0f07c2b06b793"} Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.409555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" event={"ID":"b423b0b1-b7c2-4a09-a332-cc9c03bfca51","Type":"ContainerStarted","Data":"55fc81804c2f92565e385c2bb2616b497f094cc01d824eb014017da57de07d87"} Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.409689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 
16:10:01.413352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6cznj" event={"ID":"25afedc8-ae76-4c86-aeaa-c739b1458040","Type":"ContainerStarted","Data":"0929a90ca74cbef932705282b12f5644e1349b8c71d2b735a228a9142ff15446"} Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.413514 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.416537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" event={"ID":"994ed247-8a08-4164-89b4-c03a90c4ef5d","Type":"ContainerStarted","Data":"64aba8c1ca1673ec1098d96bc51a24e8b2b143f09f579665e52c12125e2dfd5a"} Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.437999 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" podStartSLOduration=2.367783726 podStartE2EDuration="4.437962376s" podCreationTimestamp="2025-11-25 16:09:57 +0000 UTC" firstStartedPulling="2025-11-25 16:09:58.544221613 +0000 UTC m=+677.666061162" lastFinishedPulling="2025-11-25 16:10:00.614400263 +0000 UTC m=+679.736239812" observedRunningTime="2025-11-25 16:10:01.430536095 +0000 UTC m=+680.552375664" watchObservedRunningTime="2025-11-25 16:10:01.437962376 +0000 UTC m=+680.559801965" Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.453430 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6cznj" podStartSLOduration=1.687704048 podStartE2EDuration="4.453384235s" podCreationTimestamp="2025-11-25 16:09:57 +0000 UTC" firstStartedPulling="2025-11-25 16:09:57.818380282 +0000 UTC m=+676.940219831" lastFinishedPulling="2025-11-25 16:10:00.584060469 +0000 UTC m=+679.705900018" observedRunningTime="2025-11-25 16:10:01.450178516 +0000 UTC m=+680.572018095" watchObservedRunningTime="2025-11-25 
16:10:01.453384235 +0000 UTC m=+680.575223814" Nov 25 16:10:01 crc kubenswrapper[4743]: I1125 16:10:01.468011 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zhqdb" podStartSLOduration=1.917289112 podStartE2EDuration="4.467972749s" podCreationTimestamp="2025-11-25 16:09:57 +0000 UTC" firstStartedPulling="2025-11-25 16:09:58.033364922 +0000 UTC m=+677.155204471" lastFinishedPulling="2025-11-25 16:10:00.584048559 +0000 UTC m=+679.705888108" observedRunningTime="2025-11-25 16:10:01.466909787 +0000 UTC m=+680.588749396" watchObservedRunningTime="2025-11-25 16:10:01.467972749 +0000 UTC m=+680.589812318" Nov 25 16:10:03 crc kubenswrapper[4743]: I1125 16:10:03.427168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" event={"ID":"62b25135-7567-4053-ab8a-5df129154693","Type":"ContainerStarted","Data":"ba4d58aa9d6d477016c24b847a091c2995e21f0d0c293130cc21e584fc76c150"} Nov 25 16:10:03 crc kubenswrapper[4743]: I1125 16:10:03.451641 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-7q45f" podStartSLOduration=1.671166454 podStartE2EDuration="6.451613578s" podCreationTimestamp="2025-11-25 16:09:57 +0000 UTC" firstStartedPulling="2025-11-25 16:09:57.952501886 +0000 UTC m=+677.074341425" lastFinishedPulling="2025-11-25 16:10:02.732949 +0000 UTC m=+681.854788549" observedRunningTime="2025-11-25 16:10:03.447040605 +0000 UTC m=+682.568880214" watchObservedRunningTime="2025-11-25 16:10:03.451613578 +0000 UTC m=+682.573453177" Nov 25 16:10:07 crc kubenswrapper[4743]: I1125 16:10:07.810364 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6cznj" Nov 25 16:10:08 crc kubenswrapper[4743]: I1125 16:10:08.110196 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:10:08 crc kubenswrapper[4743]: I1125 16:10:08.110271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:10:08 crc kubenswrapper[4743]: I1125 16:10:08.115251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:10:08 crc kubenswrapper[4743]: I1125 16:10:08.460271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67f9bfdd55-mvfgr" Nov 25 16:10:08 crc kubenswrapper[4743]: I1125 16:10:08.509195 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4sghb"] Nov 25 16:10:18 crc kubenswrapper[4743]: I1125 16:10:18.364811 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-clgq2" Nov 25 16:10:20 crc kubenswrapper[4743]: I1125 16:10:20.078120 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:10:20 crc kubenswrapper[4743]: I1125 16:10:20.078197 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.078061 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j"] Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.080012 4743 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.081607 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.087421 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j"] Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.180444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxpx\" (UniqueName: \"kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.180745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.180863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 
16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.281751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxpx\" (UniqueName: \"kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.281810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.281848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.282393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.282480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.299261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxpx\" (UniqueName: \"kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.396873 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:30 crc kubenswrapper[4743]: I1125 16:10:30.772872 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j"] Nov 25 16:10:31 crc kubenswrapper[4743]: I1125 16:10:31.590282 4743 generic.go:334] "Generic (PLEG): container finished" podID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerID="7106b5991ad6612fcd1754661387f03044963d3f3c7b5ae4ccd981fc8fcbbac9" exitCode=0 Nov 25 16:10:31 crc kubenswrapper[4743]: I1125 16:10:31.590734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" event={"ID":"25b1c299-a6a1-4afe-a1f8-9c04410a01a0","Type":"ContainerDied","Data":"7106b5991ad6612fcd1754661387f03044963d3f3c7b5ae4ccd981fc8fcbbac9"} Nov 25 16:10:31 crc kubenswrapper[4743]: I1125 16:10:31.591343 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" event={"ID":"25b1c299-a6a1-4afe-a1f8-9c04410a01a0","Type":"ContainerStarted","Data":"af6fe64f3793c45486610dfefe02f0d220d82a8a38af91c11fca7067df4539a3"} Nov 25 16:10:33 crc kubenswrapper[4743]: I1125 16:10:33.551798 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4sghb" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" containerID="cri-o://da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e" gracePeriod=15 Nov 25 16:10:33 crc kubenswrapper[4743]: I1125 16:10:33.608924 4743 generic.go:334] "Generic (PLEG): container finished" podID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerID="7e0d2f0c0474b73f4cb549b337a3e03c5d9d91f88c633c0ef62a1d99851e3540" exitCode=0 Nov 25 16:10:33 crc kubenswrapper[4743]: I1125 16:10:33.609085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" event={"ID":"25b1c299-a6a1-4afe-a1f8-9c04410a01a0","Type":"ContainerDied","Data":"7e0d2f0c0474b73f4cb549b337a3e03c5d9d91f88c633c0ef62a1d99851e3540"} Nov 25 16:10:33 crc kubenswrapper[4743]: I1125 16:10:33.945399 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4sghb_d48456d6-9cf0-4cce-8623-9e4b2ff85ab0/console/0.log" Nov 25 16:10:33 crc kubenswrapper[4743]: I1125 16:10:33.945476 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029844 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029896 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.029993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.030037 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zbbq\" (UniqueName: \"kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq\") pod \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\" (UID: \"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0\") " Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.030608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config" (OuterVolumeSpecName: "console-config") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.030715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.030815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.030900 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca" (OuterVolumeSpecName: "service-ca") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.036142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.036243 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq" (OuterVolumeSpecName: "kube-api-access-4zbbq") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "kube-api-access-4zbbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.036414 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" (UID: "d48456d6-9cf0-4cce-8623-9e4b2ff85ab0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.130960 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131039 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131052 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131060 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131071 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131084 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.131095 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zbbq\" (UniqueName: \"kubernetes.io/projected/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0-kube-api-access-4zbbq\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:34 crc 
kubenswrapper[4743]: I1125 16:10:34.617098 4743 generic.go:334] "Generic (PLEG): container finished" podID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerID="794616fb1e4b0c78aad6fe38a18294c65ef6e1509fe3a46c8ddfc5399e7a8b14" exitCode=0 Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.617154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" event={"ID":"25b1c299-a6a1-4afe-a1f8-9c04410a01a0","Type":"ContainerDied","Data":"794616fb1e4b0c78aad6fe38a18294c65ef6e1509fe3a46c8ddfc5399e7a8b14"} Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.619963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4sghb_d48456d6-9cf0-4cce-8623-9e4b2ff85ab0/console/0.log" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.620021 4743 generic.go:334] "Generic (PLEG): container finished" podID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerID="da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e" exitCode=2 Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.620058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sghb" event={"ID":"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0","Type":"ContainerDied","Data":"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e"} Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.620069 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4sghb" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.620086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sghb" event={"ID":"d48456d6-9cf0-4cce-8623-9e4b2ff85ab0","Type":"ContainerDied","Data":"25649e0aea91d895fa9ebc882dccb6c14b84714fc7d19306819b3cef8660de1c"} Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.620104 4743 scope.go:117] "RemoveContainer" containerID="da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.643311 4743 scope.go:117] "RemoveContainer" containerID="da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e" Nov 25 16:10:34 crc kubenswrapper[4743]: E1125 16:10:34.643855 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e\": container with ID starting with da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e not found: ID does not exist" containerID="da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.643919 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e"} err="failed to get container status \"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e\": rpc error: code = NotFound desc = could not find container \"da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e\": container with ID starting with da5209388ccb33b51f2d9f51875b705027023931afe568eea189ee9efc5ae16e not found: ID does not exist" Nov 25 16:10:34 crc kubenswrapper[4743]: I1125 16:10:34.652269 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4sghb"] Nov 25 16:10:34 crc 
kubenswrapper[4743]: I1125 16:10:34.655299 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4sghb"] Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.783120 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" path="/var/lib/kubelet/pods/d48456d6-9cf0-4cce-8623-9e4b2ff85ab0/volumes" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.847613 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.853475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxpx\" (UniqueName: \"kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx\") pod \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.853619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle\") pod \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.853691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util\") pod \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\" (UID: \"25b1c299-a6a1-4afe-a1f8-9c04410a01a0\") " Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.855103 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle" (OuterVolumeSpecName: "bundle") pod "25b1c299-a6a1-4afe-a1f8-9c04410a01a0" (UID: 
"25b1c299-a6a1-4afe-a1f8-9c04410a01a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.857993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx" (OuterVolumeSpecName: "kube-api-access-wpxpx") pod "25b1c299-a6a1-4afe-a1f8-9c04410a01a0" (UID: "25b1c299-a6a1-4afe-a1f8-9c04410a01a0"). InnerVolumeSpecName "kube-api-access-wpxpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.875531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util" (OuterVolumeSpecName: "util") pod "25b1c299-a6a1-4afe-a1f8-9c04410a01a0" (UID: "25b1c299-a6a1-4afe-a1f8-9c04410a01a0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.955507 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxpx\" (UniqueName: \"kubernetes.io/projected/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-kube-api-access-wpxpx\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.955552 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:35 crc kubenswrapper[4743]: I1125 16:10:35.955562 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25b1c299-a6a1-4afe-a1f8-9c04410a01a0-util\") on node \"crc\" DevicePath \"\"" Nov 25 16:10:36 crc kubenswrapper[4743]: I1125 16:10:36.634548 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" 
event={"ID":"25b1c299-a6a1-4afe-a1f8-9c04410a01a0","Type":"ContainerDied","Data":"af6fe64f3793c45486610dfefe02f0d220d82a8a38af91c11fca7067df4539a3"} Nov 25 16:10:36 crc kubenswrapper[4743]: I1125 16:10:36.634587 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j" Nov 25 16:10:36 crc kubenswrapper[4743]: I1125 16:10:36.634612 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af6fe64f3793c45486610dfefe02f0d220d82a8a38af91c11fca7067df4539a3" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.173663 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49"] Nov 25 16:10:46 crc kubenswrapper[4743]: E1125 16:10:46.174475 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="extract" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="extract" Nov 25 16:10:46 crc kubenswrapper[4743]: E1125 16:10:46.174501 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174508 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" Nov 25 16:10:46 crc kubenswrapper[4743]: E1125 16:10:46.174530 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="pull" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174538 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="pull" Nov 25 16:10:46 crc kubenswrapper[4743]: E1125 16:10:46.174554 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="util" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174561 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="util" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174685 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48456d6-9cf0-4cce-8623-9e4b2ff85ab0" containerName="console" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.174707 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b1c299-a6a1-4afe-a1f8-9c04410a01a0" containerName="extract" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.175167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.186721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.186859 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.186885 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.186949 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-s4bg7" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.186811 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.207108 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49"] Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.291581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-webhook-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.291958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g8gr\" (UniqueName: \"kubernetes.io/projected/03ed5f22-b285-4560-8572-798606c90e7b-kube-api-access-7g8gr\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.292108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-apiservice-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.393552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-apiservice-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.393974 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-webhook-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.394652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g8gr\" (UniqueName: \"kubernetes.io/projected/03ed5f22-b285-4560-8572-798606c90e7b-kube-api-access-7g8gr\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.402297 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8"] Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.402976 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.404726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-webhook-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.406125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03ed5f22-b285-4560-8572-798606c90e7b-apiservice-cert\") pod \"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.418007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.418366 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8n2gs" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.418623 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.422781 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8"] Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.423696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g8gr\" (UniqueName: \"kubernetes.io/projected/03ed5f22-b285-4560-8572-798606c90e7b-kube-api-access-7g8gr\") pod 
\"metallb-operator-controller-manager-b6fb67cbb-wpj49\" (UID: \"03ed5f22-b285-4560-8572-798606c90e7b\") " pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.492016 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.495708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkdj\" (UniqueName: \"kubernetes.io/projected/bed7e486-cad7-437c-8196-4fc08dd20eb6-kube-api-access-9zkdj\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.495762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-apiservice-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.495903 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-webhook-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.600853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-apiservice-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.601204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-webhook-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.601246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkdj\" (UniqueName: \"kubernetes.io/projected/bed7e486-cad7-437c-8196-4fc08dd20eb6-kube-api-access-9zkdj\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.609237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-apiservice-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.609818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bed7e486-cad7-437c-8196-4fc08dd20eb6-webhook-cert\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc 
kubenswrapper[4743]: I1125 16:10:46.626520 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkdj\" (UniqueName: \"kubernetes.io/projected/bed7e486-cad7-437c-8196-4fc08dd20eb6-kube-api-access-9zkdj\") pod \"metallb-operator-webhook-server-84c6f5f694-f9nf8\" (UID: \"bed7e486-cad7-437c-8196-4fc08dd20eb6\") " pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.714782 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49"] Nov 25 16:10:46 crc kubenswrapper[4743]: W1125 16:10:46.717938 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ed5f22_b285_4560_8572_798606c90e7b.slice/crio-a11248006c3eebf20a9d9460de3b35959f7fcb7cd68c1429890e63a61a102611 WatchSource:0}: Error finding container a11248006c3eebf20a9d9460de3b35959f7fcb7cd68c1429890e63a61a102611: Status 404 returned error can't find the container with id a11248006c3eebf20a9d9460de3b35959f7fcb7cd68c1429890e63a61a102611 Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.769694 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:46 crc kubenswrapper[4743]: I1125 16:10:46.968830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8"] Nov 25 16:10:46 crc kubenswrapper[4743]: W1125 16:10:46.975135 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbed7e486_cad7_437c_8196_4fc08dd20eb6.slice/crio-c80755dcef9ccea1d5a679bd0fb3ece124f0bdf57327ea55c388f3c509d4cc85 WatchSource:0}: Error finding container c80755dcef9ccea1d5a679bd0fb3ece124f0bdf57327ea55c388f3c509d4cc85: Status 404 returned error can't find the container with id c80755dcef9ccea1d5a679bd0fb3ece124f0bdf57327ea55c388f3c509d4cc85 Nov 25 16:10:47 crc kubenswrapper[4743]: I1125 16:10:47.698852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" event={"ID":"bed7e486-cad7-437c-8196-4fc08dd20eb6","Type":"ContainerStarted","Data":"c80755dcef9ccea1d5a679bd0fb3ece124f0bdf57327ea55c388f3c509d4cc85"} Nov 25 16:10:47 crc kubenswrapper[4743]: I1125 16:10:47.700413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" event={"ID":"03ed5f22-b285-4560-8572-798606c90e7b","Type":"ContainerStarted","Data":"a11248006c3eebf20a9d9460de3b35959f7fcb7cd68c1429890e63a61a102611"} Nov 25 16:10:50 crc kubenswrapper[4743]: I1125 16:10:50.077851 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:10:50 crc kubenswrapper[4743]: I1125 16:10:50.078194 4743 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.724482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" event={"ID":"bed7e486-cad7-437c-8196-4fc08dd20eb6","Type":"ContainerStarted","Data":"12a4a1c87870ed9a98a68b1b25863319b69527000104f3af0ce8c7f33700e1fc"} Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.724856 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.726016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" event={"ID":"03ed5f22-b285-4560-8572-798606c90e7b","Type":"ContainerStarted","Data":"429a9d149f2b2aa552c0e57be69d6eaecc796e86ff5df38dc59344215874a66a"} Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.726397 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.743116 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" podStartSLOduration=1.779563409 podStartE2EDuration="5.74309914s" podCreationTimestamp="2025-11-25 16:10:46 +0000 UTC" firstStartedPulling="2025-11-25 16:10:46.978505526 +0000 UTC m=+726.100345075" lastFinishedPulling="2025-11-25 16:10:50.942041257 +0000 UTC m=+730.063880806" observedRunningTime="2025-11-25 16:10:51.742403988 +0000 UTC m=+730.864243557" watchObservedRunningTime="2025-11-25 16:10:51.74309914 +0000 UTC 
m=+730.864938689" Nov 25 16:10:51 crc kubenswrapper[4743]: I1125 16:10:51.765763 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" podStartSLOduration=1.563461426 podStartE2EDuration="5.765745019s" podCreationTimestamp="2025-11-25 16:10:46 +0000 UTC" firstStartedPulling="2025-11-25 16:10:46.720732058 +0000 UTC m=+725.842571607" lastFinishedPulling="2025-11-25 16:10:50.923015631 +0000 UTC m=+730.044855200" observedRunningTime="2025-11-25 16:10:51.762139166 +0000 UTC m=+730.883978725" watchObservedRunningTime="2025-11-25 16:10:51.765745019 +0000 UTC m=+730.887584568" Nov 25 16:11:06 crc kubenswrapper[4743]: I1125 16:11:06.774583 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-84c6f5f694-f9nf8" Nov 25 16:11:18 crc kubenswrapper[4743]: I1125 16:11:18.934761 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.077126 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.077470 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.077522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.078109 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.078185 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025" gracePeriod=600 Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.895418 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025" exitCode=0 Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.895498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025"} Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.895796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c"} Nov 25 16:11:20 crc kubenswrapper[4743]: I1125 16:11:20.895820 4743 scope.go:117] "RemoveContainer" 
containerID="40809af90426567c28633f54d9909efc25bbfed89b36875fb90d82bda2d56570" Nov 25 16:11:26 crc kubenswrapper[4743]: I1125 16:11:26.495240 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-b6fb67cbb-wpj49" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.242316 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-nv4nd"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.245260 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.250946 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.251182 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.251417 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-svd2v" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.253152 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.253900 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.256497 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.272492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsxd9\" (UniqueName: \"kubernetes.io/projected/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-kube-api-access-lsxd9\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmqx\" (UniqueName: \"kubernetes.io/projected/ddf4605c-5031-4d69-9b22-a49126d26f66-kube-api-access-hhmqx\") pod \"frr-k8s-nv4nd\" (UID: 
\"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-startup\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308396 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-sockets\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics-certs\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-reloader\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.308584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-conf\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 
16:11:27.337842 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8f677"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.339184 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.341436 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.341673 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.341892 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.342093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-7r452" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.352308 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-685f4"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.353161 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.355543 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.369809 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-685f4"] Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpm8g\" (UniqueName: \"kubernetes.io/projected/d805cc14-bb31-4762-9079-dedb5e33e391-kube-api-access-vpm8g\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsb8\" (UniqueName: \"kubernetes.io/projected/fd919f22-093b-4ba9-bbc1-06a5360f6f32-kube-api-access-8rsb8\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-startup\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc 
kubenswrapper[4743]: I1125 16:11:27.410359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-cert\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-sockets\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics-certs\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-reloader\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410574 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-conf\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410640 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metrics-certs\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metallb-excludel2\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsxd9\" (UniqueName: \"kubernetes.io/projected/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-kube-api-access-lsxd9\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.410810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmqx\" (UniqueName: \"kubernetes.io/projected/ddf4605c-5031-4d69-9b22-a49126d26f66-kube-api-access-hhmqx\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.411104 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.411159 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert podName:77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec nodeName:}" failed. No retries permitted until 2025-11-25 16:11:27.911139138 +0000 UTC m=+767.032978687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert") pod "frr-k8s-webhook-server-6998585d5-q2tm6" (UID: "77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec") : secret "frr-k8s-webhook-server-cert" not found Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.411229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-startup\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.411274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.411443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-conf\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.412047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-reloader\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.412450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ddf4605c-5031-4d69-9b22-a49126d26f66-frr-sockets\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" 
Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.416697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddf4605c-5031-4d69-9b22-a49126d26f66-metrics-certs\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.427725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsxd9\" (UniqueName: \"kubernetes.io/projected/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-kube-api-access-lsxd9\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.428901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmqx\" (UniqueName: \"kubernetes.io/projected/ddf4605c-5031-4d69-9b22-a49126d26f66-kube-api-access-hhmqx\") pod \"frr-k8s-nv4nd\" (UID: \"ddf4605c-5031-4d69-9b22-a49126d26f66\") " pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpm8g\" (UniqueName: \"kubernetes.io/projected/d805cc14-bb31-4762-9079-dedb5e33e391-kube-api-access-vpm8g\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511465 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rsb8\" (UniqueName: \"kubernetes.io/projected/fd919f22-093b-4ba9-bbc1-06a5360f6f32-kube-api-access-8rsb8\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-cert\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511503 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metrics-certs\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.511562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metallb-excludel2\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.512278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metallb-excludel2\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.512659 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.512761 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist podName:fd919f22-093b-4ba9-bbc1-06a5360f6f32 nodeName:}" failed. No retries permitted until 2025-11-25 16:11:28.012739658 +0000 UTC m=+767.134579207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist") pod "speaker-8f677" (UID: "fd919f22-093b-4ba9-bbc1-06a5360f6f32") : secret "metallb-memberlist" not found Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.512673 4743 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 25 16:11:27 crc kubenswrapper[4743]: E1125 16:11:27.512933 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs podName:d805cc14-bb31-4762-9079-dedb5e33e391 nodeName:}" failed. No retries permitted until 2025-11-25 16:11:28.012924054 +0000 UTC m=+767.134763713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs") pod "controller-6c7b4b5f48-685f4" (UID: "d805cc14-bb31-4762-9079-dedb5e33e391") : secret "controller-certs-secret" not found Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.515407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-cert\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.522140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-metrics-certs\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.539248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsb8\" (UniqueName: \"kubernetes.io/projected/fd919f22-093b-4ba9-bbc1-06a5360f6f32-kube-api-access-8rsb8\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.556413 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpm8g\" (UniqueName: \"kubernetes.io/projected/d805cc14-bb31-4762-9079-dedb5e33e391-kube-api-access-vpm8g\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.569935 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.916575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.919895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec-cert\") pod \"frr-k8s-webhook-server-6998585d5-q2tm6\" (UID: \"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:27 crc kubenswrapper[4743]: I1125 16:11:27.935168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"b70321f326d904d3ee8f6c7a7a7b51fe4446fa7781f0a5d5d7996bdb1f33d4bb"} Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.018131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.018188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:28 crc kubenswrapper[4743]: E1125 16:11:28.018258 4743 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 16:11:28 crc kubenswrapper[4743]: E1125 16:11:28.018335 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist podName:fd919f22-093b-4ba9-bbc1-06a5360f6f32 nodeName:}" failed. No retries permitted until 2025-11-25 16:11:29.018311323 +0000 UTC m=+768.140150892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist") pod "speaker-8f677" (UID: "fd919f22-093b-4ba9-bbc1-06a5360f6f32") : secret "metallb-memberlist" not found Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.020922 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d805cc14-bb31-4762-9079-dedb5e33e391-metrics-certs\") pod \"controller-6c7b4b5f48-685f4\" (UID: \"d805cc14-bb31-4762-9079-dedb5e33e391\") " pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.185888 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.272943 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.471848 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-685f4"] Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.554977 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6"] Nov 25 16:11:28 crc kubenswrapper[4743]: W1125 16:11:28.562966 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77af4b9b_ed3c_4b08_ab42_bd5b3c69cdec.slice/crio-0538bd343cf1fed45577be632d23de2f6d8ad521c9b87abbd352880b4ed7f099 WatchSource:0}: Error finding container 0538bd343cf1fed45577be632d23de2f6d8ad521c9b87abbd352880b4ed7f099: Status 404 returned error can't find the container with id 0538bd343cf1fed45577be632d23de2f6d8ad521c9b87abbd352880b4ed7f099 Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.945095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-685f4" event={"ID":"d805cc14-bb31-4762-9079-dedb5e33e391","Type":"ContainerStarted","Data":"a7456001b1c240e5d1a3d995ef5878072d6d4e278d8ec90c387fbebd11644ba7"} Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.945473 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.945493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-685f4" event={"ID":"d805cc14-bb31-4762-9079-dedb5e33e391","Type":"ContainerStarted","Data":"e9d7d15faf2976a36e28ec12ffa97af98ec8604bb4edded40a331063c35dfb4d"} Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.945506 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-685f4" 
event={"ID":"d805cc14-bb31-4762-9079-dedb5e33e391","Type":"ContainerStarted","Data":"c911fb373f740acaf92b6cc48b3b870fbe8cf3fc0d9c8a612d073f3c95bd62ba"} Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.948631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" event={"ID":"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec","Type":"ContainerStarted","Data":"0538bd343cf1fed45577be632d23de2f6d8ad521c9b87abbd352880b4ed7f099"} Nov 25 16:11:28 crc kubenswrapper[4743]: I1125 16:11:28.967659 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-685f4" podStartSLOduration=1.9676377779999998 podStartE2EDuration="1.967637778s" podCreationTimestamp="2025-11-25 16:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:11:28.960449552 +0000 UTC m=+768.082289121" watchObservedRunningTime="2025-11-25 16:11:28.967637778 +0000 UTC m=+768.089477327" Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.030166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.037161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fd919f22-093b-4ba9-bbc1-06a5360f6f32-memberlist\") pod \"speaker-8f677\" (UID: \"fd919f22-093b-4ba9-bbc1-06a5360f6f32\") " pod="metallb-system/speaker-8f677" Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.167864 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8f677" Nov 25 16:11:29 crc kubenswrapper[4743]: W1125 16:11:29.202460 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd919f22_093b_4ba9_bbc1_06a5360f6f32.slice/crio-e15ff27a299aaa4aab219688ec73f332169f8c30c8619f035eca5cb0985668e2 WatchSource:0}: Error finding container e15ff27a299aaa4aab219688ec73f332169f8c30c8619f035eca5cb0985668e2: Status 404 returned error can't find the container with id e15ff27a299aaa4aab219688ec73f332169f8c30c8619f035eca5cb0985668e2 Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.956137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8f677" event={"ID":"fd919f22-093b-4ba9-bbc1-06a5360f6f32","Type":"ContainerStarted","Data":"e81edfb584cb4323f6f0e589aefd95a16df314e17d39539f3754032a7f29e622"} Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.956824 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8f677" event={"ID":"fd919f22-093b-4ba9-bbc1-06a5360f6f32","Type":"ContainerStarted","Data":"ce0308586eb381cf262e39b9fda93c833b5467e8af7e2c6c6864be2dd6bc33d0"} Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.956847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8f677" event={"ID":"fd919f22-093b-4ba9-bbc1-06a5360f6f32","Type":"ContainerStarted","Data":"e15ff27a299aaa4aab219688ec73f332169f8c30c8619f035eca5cb0985668e2"} Nov 25 16:11:29 crc kubenswrapper[4743]: I1125 16:11:29.957114 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8f677" Nov 25 16:11:31 crc kubenswrapper[4743]: I1125 16:11:31.794043 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8f677" podStartSLOduration=4.794027445 podStartE2EDuration="4.794027445s" podCreationTimestamp="2025-11-25 16:11:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:11:29.985463346 +0000 UTC m=+769.107302905" watchObservedRunningTime="2025-11-25 16:11:31.794027445 +0000 UTC m=+770.915866994" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.109432 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.110823 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.123576 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.214696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.214794 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.214852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfgg\" (UniqueName: \"kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " 
pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.316557 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfgg\" (UniqueName: \"kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.316666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.316717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.317232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.317294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " 
pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.340045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfgg\" (UniqueName: \"kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg\") pod \"community-operators-phn76\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.428780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.757222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.985697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" event={"ID":"77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec","Type":"ContainerStarted","Data":"0d6bfdb9ae079aadcd79d070fd9dc829630d96ec1413d48d03ec9b17935b3175"} Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.985847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.987559 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerID="680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06" exitCode=0 Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.987656 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerDied","Data":"680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06"} Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.987682 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerStarted","Data":"07e7a324c7b560a16793b701f07e729339973735d9576775cb3f91a15b9b88a1"} Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.990225 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddf4605c-5031-4d69-9b22-a49126d26f66" containerID="d75d55609ff04fe52556d7c5fe01fc69ff11d48a332ae729fa9db78289a9f17b" exitCode=0 Nov 25 16:11:34 crc kubenswrapper[4743]: I1125 16:11:34.990269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerDied","Data":"d75d55609ff04fe52556d7c5fe01fc69ff11d48a332ae729fa9db78289a9f17b"} Nov 25 16:11:35 crc kubenswrapper[4743]: I1125 16:11:35.001087 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" podStartSLOduration=1.982074839 podStartE2EDuration="8.001070066s" podCreationTimestamp="2025-11-25 16:11:27 +0000 UTC" firstStartedPulling="2025-11-25 16:11:28.564417387 +0000 UTC m=+767.686256936" lastFinishedPulling="2025-11-25 16:11:34.583412614 +0000 UTC m=+773.705252163" observedRunningTime="2025-11-25 16:11:34.99861707 +0000 UTC m=+774.120456619" watchObservedRunningTime="2025-11-25 16:11:35.001070066 +0000 UTC m=+774.122909605" Nov 25 16:11:35 crc kubenswrapper[4743]: I1125 16:11:35.997656 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddf4605c-5031-4d69-9b22-a49126d26f66" containerID="bafc633585875e315cf09390feb7f0c98cce56dc2363867bbd27e6a5a5208c1d" exitCode=0 Nov 25 16:11:35 crc kubenswrapper[4743]: I1125 16:11:35.997701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" 
event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerDied","Data":"bafc633585875e315cf09390feb7f0c98cce56dc2363867bbd27e6a5a5208c1d"} Nov 25 16:11:36 crc kubenswrapper[4743]: I1125 16:11:36.000024 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerID="1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16" exitCode=0 Nov 25 16:11:36 crc kubenswrapper[4743]: I1125 16:11:36.000093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerDied","Data":"1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16"} Nov 25 16:11:37 crc kubenswrapper[4743]: I1125 16:11:37.006938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerStarted","Data":"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49"} Nov 25 16:11:37 crc kubenswrapper[4743]: I1125 16:11:37.008964 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddf4605c-5031-4d69-9b22-a49126d26f66" containerID="20703532d7c88bf926f7c8307bdc3386b864e399c7759afa26ff1683d4b7c300" exitCode=0 Nov 25 16:11:37 crc kubenswrapper[4743]: I1125 16:11:37.009017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerDied","Data":"20703532d7c88bf926f7c8307bdc3386b864e399c7759afa26ff1683d4b7c300"} Nov 25 16:11:37 crc kubenswrapper[4743]: I1125 16:11:37.027550 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phn76" podStartSLOduration=1.589462414 podStartE2EDuration="3.027533216s" podCreationTimestamp="2025-11-25 16:11:34 +0000 UTC" firstStartedPulling="2025-11-25 16:11:34.988732451 +0000 UTC m=+774.110572000" 
lastFinishedPulling="2025-11-25 16:11:36.426803253 +0000 UTC m=+775.548642802" observedRunningTime="2025-11-25 16:11:37.025622056 +0000 UTC m=+776.147461595" watchObservedRunningTime="2025-11-25 16:11:37.027533216 +0000 UTC m=+776.149372765" Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025007 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"3c6e7c96894930d088fb6b9f84829df37a30b24d1c9e875f8185a0b5c58c831c"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"80d57b91d276e2e961ddf1036e330fdf79d3897343b279e9abc331fc118d0a9d"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"72e3b36bbd718f19f7272b5b17d7aba4fd01f908679d0604b616884685e56fd5"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"2b533d7f8b31a261ab375d541d4d8b2f801cbe7994e149955c353bfa7b0f9064"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"a8a584f171fada07270ec3ef53cf80f26b497bf6083fe76f6e630452c27f96c9"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.025399 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nv4nd" 
event={"ID":"ddf4605c-5031-4d69-9b22-a49126d26f66","Type":"ContainerStarted","Data":"e865ee7ff2f94bbf6eceeeb51c7e5cf46ebf782280b992e4e75a9a781b0d0598"} Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.089584 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-nv4nd" podStartSLOduration=4.281014997 podStartE2EDuration="11.089556898s" podCreationTimestamp="2025-11-25 16:11:27 +0000 UTC" firstStartedPulling="2025-11-25 16:11:27.744888724 +0000 UTC m=+766.866728273" lastFinishedPulling="2025-11-25 16:11:34.553430625 +0000 UTC m=+773.675270174" observedRunningTime="2025-11-25 16:11:38.065262248 +0000 UTC m=+777.187101817" watchObservedRunningTime="2025-11-25 16:11:38.089556898 +0000 UTC m=+777.211396447" Nov 25 16:11:38 crc kubenswrapper[4743]: I1125 16:11:38.276389 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-685f4" Nov 25 16:11:39 crc kubenswrapper[4743]: I1125 16:11:39.032206 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:39 crc kubenswrapper[4743]: I1125 16:11:39.172503 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8f677" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.281896 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.284397 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.287571 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.288579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nznv5" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.288807 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.311583 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.315897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxz94\" (UniqueName: \"kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94\") pod \"openstack-operator-index-mtwbr\" (UID: \"089ebabb-d64e-45b1-acb3-038dc1e26bcc\") " pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.417392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxz94\" (UniqueName: \"kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94\") pod \"openstack-operator-index-mtwbr\" (UID: \"089ebabb-d64e-45b1-acb3-038dc1e26bcc\") " pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.439379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxz94\" (UniqueName: \"kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94\") pod \"openstack-operator-index-mtwbr\" (UID: 
\"089ebabb-d64e-45b1-acb3-038dc1e26bcc\") " pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.571316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.606826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.607763 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:42 crc kubenswrapper[4743]: I1125 16:11:42.792231 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:43 crc kubenswrapper[4743]: I1125 16:11:43.078990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtwbr" event={"ID":"089ebabb-d64e-45b1-acb3-038dc1e26bcc","Type":"ContainerStarted","Data":"d09da315bc24f570c4aadd504b9c4631495524d9261c9a84dca15872e6499322"} Nov 25 16:11:44 crc kubenswrapper[4743]: I1125 16:11:44.429341 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:44 crc kubenswrapper[4743]: I1125 16:11:44.429719 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:44 crc kubenswrapper[4743]: I1125 16:11:44.477969 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:45 crc kubenswrapper[4743]: I1125 16:11:45.139864 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:45 crc kubenswrapper[4743]: I1125 16:11:45.856964 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.096175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtwbr" event={"ID":"089ebabb-d64e-45b1-acb3-038dc1e26bcc","Type":"ContainerStarted","Data":"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9"} Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.108876 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mtwbr" podStartSLOduration=1.963189085 podStartE2EDuration="4.108823434s" podCreationTimestamp="2025-11-25 16:11:42 +0000 UTC" firstStartedPulling="2025-11-25 16:11:42.806429238 +0000 UTC m=+781.928268787" lastFinishedPulling="2025-11-25 16:11:44.952063577 +0000 UTC m=+784.073903136" observedRunningTime="2025-11-25 16:11:46.107271075 +0000 UTC m=+785.229110694" watchObservedRunningTime="2025-11-25 16:11:46.108823434 +0000 UTC m=+785.230663003" Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.664211 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9gt7m"] Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.664910 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.673878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9gt7m"] Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.775778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9fhf\" (UniqueName: \"kubernetes.io/projected/7413a348-450f-4717-a52f-595041381991-kube-api-access-c9fhf\") pod \"openstack-operator-index-9gt7m\" (UID: \"7413a348-450f-4717-a52f-595041381991\") " pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.877876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9fhf\" (UniqueName: \"kubernetes.io/projected/7413a348-450f-4717-a52f-595041381991-kube-api-access-c9fhf\") pod \"openstack-operator-index-9gt7m\" (UID: \"7413a348-450f-4717-a52f-595041381991\") " pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:46 crc kubenswrapper[4743]: I1125 16:11:46.897544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9fhf\" (UniqueName: \"kubernetes.io/projected/7413a348-450f-4717-a52f-595041381991-kube-api-access-c9fhf\") pod \"openstack-operator-index-9gt7m\" (UID: \"7413a348-450f-4717-a52f-595041381991\") " pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.000230 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.105647 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mtwbr" podUID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" containerName="registry-server" containerID="cri-o://4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9" gracePeriod=2 Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.405496 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9gt7m"] Nov 25 16:11:47 crc kubenswrapper[4743]: W1125 16:11:47.409827 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7413a348_450f_4717_a52f_595041381991.slice/crio-e8c5996d8330a4232ea6fe033139838f958cb039573187223018d47878cc540b WatchSource:0}: Error finding container e8c5996d8330a4232ea6fe033139838f958cb039573187223018d47878cc540b: Status 404 returned error can't find the container with id e8c5996d8330a4232ea6fe033139838f958cb039573187223018d47878cc540b Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.453196 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.486166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxz94\" (UniqueName: \"kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94\") pod \"089ebabb-d64e-45b1-acb3-038dc1e26bcc\" (UID: \"089ebabb-d64e-45b1-acb3-038dc1e26bcc\") " Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.493470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94" (OuterVolumeSpecName: "kube-api-access-wxz94") pod "089ebabb-d64e-45b1-acb3-038dc1e26bcc" (UID: "089ebabb-d64e-45b1-acb3-038dc1e26bcc"). InnerVolumeSpecName "kube-api-access-wxz94". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.573810 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-nv4nd" Nov 25 16:11:47 crc kubenswrapper[4743]: I1125 16:11:47.587823 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxz94\" (UniqueName: \"kubernetes.io/projected/089ebabb-d64e-45b1-acb3-038dc1e26bcc-kube-api-access-wxz94\") on node \"crc\" DevicePath \"\"" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.112582 4743 generic.go:334] "Generic (PLEG): container finished" podID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" containerID="4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9" exitCode=0 Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.112973 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtwbr" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.114213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtwbr" event={"ID":"089ebabb-d64e-45b1-acb3-038dc1e26bcc","Type":"ContainerDied","Data":"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9"} Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.114275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtwbr" event={"ID":"089ebabb-d64e-45b1-acb3-038dc1e26bcc","Type":"ContainerDied","Data":"d09da315bc24f570c4aadd504b9c4631495524d9261c9a84dca15872e6499322"} Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.114304 4743 scope.go:117] "RemoveContainer" containerID="4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.114327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9gt7m" event={"ID":"7413a348-450f-4717-a52f-595041381991","Type":"ContainerStarted","Data":"96383fb03c429f724aba47e43a074cb9eb5c2eefecb2c92bbbd82d7d27d686be"} Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.114366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9gt7m" event={"ID":"7413a348-450f-4717-a52f-595041381991","Type":"ContainerStarted","Data":"e8c5996d8330a4232ea6fe033139838f958cb039573187223018d47878cc540b"} Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.131233 4743 scope.go:117] "RemoveContainer" containerID="4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.131252 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:48 crc kubenswrapper[4743]: E1125 16:11:48.131729 4743 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9\": container with ID starting with 4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9 not found: ID does not exist" containerID="4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.131767 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9"} err="failed to get container status \"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9\": rpc error: code = NotFound desc = could not find container \"4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9\": container with ID starting with 4ba3d09ae06b03c939671cc36b4051c541a4155826565f63d84b194c63abaeb9 not found: ID does not exist" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.135311 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mtwbr"] Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.140407 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9gt7m" podStartSLOduration=2.085480985 podStartE2EDuration="2.140393043s" podCreationTimestamp="2025-11-25 16:11:46 +0000 UTC" firstStartedPulling="2025-11-25 16:11:47.415610547 +0000 UTC m=+786.537450096" lastFinishedPulling="2025-11-25 16:11:47.470522595 +0000 UTC m=+786.592362154" observedRunningTime="2025-11-25 16:11:48.140120514 +0000 UTC m=+787.261960103" watchObservedRunningTime="2025-11-25 16:11:48.140393043 +0000 UTC m=+787.262232592" Nov 25 16:11:48 crc kubenswrapper[4743]: I1125 16:11:48.193150 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-q2tm6" Nov 25 
16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.262941 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.263225 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-phn76" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="registry-server" containerID="cri-o://af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49" gracePeriod=2 Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.621580 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.713232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content\") pod \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.713330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfgg\" (UniqueName: \"kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg\") pod \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.713392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities\") pod \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\" (UID: \"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e\") " Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.714693 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities" (OuterVolumeSpecName: "utilities") pod "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" (UID: "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.719293 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg" (OuterVolumeSpecName: "kube-api-access-9jfgg") pod "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" (UID: "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e"). InnerVolumeSpecName "kube-api-access-9jfgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.760335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" (UID: "f6db7da6-bc1b-44ee-8575-b5f468ec9f3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.782028 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" path="/var/lib/kubelet/pods/089ebabb-d64e-45b1-acb3-038dc1e26bcc/volumes" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.814721 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfgg\" (UniqueName: \"kubernetes.io/projected/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-kube-api-access-9jfgg\") on node \"crc\" DevicePath \"\"" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.814757 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:11:49 crc kubenswrapper[4743]: I1125 16:11:49.814769 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.131643 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerID="af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49" exitCode=0 Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.131699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerDied","Data":"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49"} Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.131729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phn76" 
event={"ID":"f6db7da6-bc1b-44ee-8575-b5f468ec9f3e","Type":"ContainerDied","Data":"07e7a324c7b560a16793b701f07e729339973735d9576775cb3f91a15b9b88a1"} Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.131744 4743 scope.go:117] "RemoveContainer" containerID="af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.131681 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phn76" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.147273 4743 scope.go:117] "RemoveContainer" containerID="1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.150862 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.155098 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-phn76"] Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.173807 4743 scope.go:117] "RemoveContainer" containerID="680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.197291 4743 scope.go:117] "RemoveContainer" containerID="af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49" Nov 25 16:11:50 crc kubenswrapper[4743]: E1125 16:11:50.197763 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49\": container with ID starting with af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49 not found: ID does not exist" containerID="af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.197807 4743 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49"} err="failed to get container status \"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49\": rpc error: code = NotFound desc = could not find container \"af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49\": container with ID starting with af81397897206430ed842da4161d7276f4efff3fe8d2444a0a5275f1c5822d49 not found: ID does not exist" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.197834 4743 scope.go:117] "RemoveContainer" containerID="1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16" Nov 25 16:11:50 crc kubenswrapper[4743]: E1125 16:11:50.198095 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16\": container with ID starting with 1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16 not found: ID does not exist" containerID="1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.198120 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16"} err="failed to get container status \"1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16\": rpc error: code = NotFound desc = could not find container \"1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16\": container with ID starting with 1e41441fbb1f96df61c913adf3cbf6b9d30a529d5ddc67fa17565e15da2a4b16 not found: ID does not exist" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.198135 4743 scope.go:117] "RemoveContainer" containerID="680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06" Nov 25 16:11:50 crc kubenswrapper[4743]: E1125 16:11:50.198455 4743 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06\": container with ID starting with 680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06 not found: ID does not exist" containerID="680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06" Nov 25 16:11:50 crc kubenswrapper[4743]: I1125 16:11:50.198478 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06"} err="failed to get container status \"680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06\": rpc error: code = NotFound desc = could not find container \"680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06\": container with ID starting with 680ba710fc23b30239ced487e7bfdcf5a34e975f1f00d9e52e6c97f868939b06 not found: ID does not exist" Nov 25 16:11:51 crc kubenswrapper[4743]: I1125 16:11:51.785318 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" path="/var/lib/kubelet/pods/f6db7da6-bc1b-44ee-8575-b5f468ec9f3e/volumes" Nov 25 16:11:57 crc kubenswrapper[4743]: I1125 16:11:57.001026 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:57 crc kubenswrapper[4743]: I1125 16:11:57.001574 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:57 crc kubenswrapper[4743]: I1125 16:11:57.032993 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:11:57 crc kubenswrapper[4743]: I1125 16:11:57.189773 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9gt7m" Nov 25 16:12:11 crc 
kubenswrapper[4743]: I1125 16:12:11.304890 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd"] Nov 25 16:12:11 crc kubenswrapper[4743]: E1125 16:12:11.305792 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" containerName="registry-server" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.305811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" containerName="registry-server" Nov 25 16:12:11 crc kubenswrapper[4743]: E1125 16:12:11.305823 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="registry-server" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.305832 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="registry-server" Nov 25 16:12:11 crc kubenswrapper[4743]: E1125 16:12:11.305875 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="extract-content" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.305885 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="extract-content" Nov 25 16:12:11 crc kubenswrapper[4743]: E1125 16:12:11.305896 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="extract-utilities" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.305903 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="extract-utilities" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.306034 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="089ebabb-d64e-45b1-acb3-038dc1e26bcc" containerName="registry-server" Nov 25 16:12:11 
crc kubenswrapper[4743]: I1125 16:12:11.306057 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6db7da6-bc1b-44ee-8575-b5f468ec9f3e" containerName="registry-server" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.307052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.309066 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j5ckp" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.317385 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd"] Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.404511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.404574 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhnx\" (UniqueName: \"kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.404737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.506420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.506706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhnx\" (UniqueName: \"kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.506805 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.506984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: 
\"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.507322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.526728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhnx\" (UniqueName: \"kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx\") pod \"02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:11 crc kubenswrapper[4743]: I1125 16:12:11.624367 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:12 crc kubenswrapper[4743]: I1125 16:12:12.015436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd"] Nov 25 16:12:12 crc kubenswrapper[4743]: I1125 16:12:12.254230 4743 generic.go:334] "Generic (PLEG): container finished" podID="ab54bea3-befb-4d86-a499-806f480df7b0" containerID="ff768e16d608e98e50df039a6db3364fb1d0d20ed7e79efc9ac6a4751b2fba4e" exitCode=0 Nov 25 16:12:12 crc kubenswrapper[4743]: I1125 16:12:12.254293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" event={"ID":"ab54bea3-befb-4d86-a499-806f480df7b0","Type":"ContainerDied","Data":"ff768e16d608e98e50df039a6db3364fb1d0d20ed7e79efc9ac6a4751b2fba4e"} Nov 25 16:12:12 crc kubenswrapper[4743]: I1125 16:12:12.254407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" event={"ID":"ab54bea3-befb-4d86-a499-806f480df7b0","Type":"ContainerStarted","Data":"7196bfaaecbc5ceca22e58254027d8c657645d03dbb618cb2e380252b8668c2d"} Nov 25 16:12:13 crc kubenswrapper[4743]: I1125 16:12:13.262996 4743 generic.go:334] "Generic (PLEG): container finished" podID="ab54bea3-befb-4d86-a499-806f480df7b0" containerID="cb39038a8141c0e4cf503b29d16573761ecfcb2b1076516bb1fc115bc19fc783" exitCode=0 Nov 25 16:12:13 crc kubenswrapper[4743]: I1125 16:12:13.263084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" event={"ID":"ab54bea3-befb-4d86-a499-806f480df7b0","Type":"ContainerDied","Data":"cb39038a8141c0e4cf503b29d16573761ecfcb2b1076516bb1fc115bc19fc783"} Nov 25 16:12:14 crc kubenswrapper[4743]: I1125 16:12:14.270877 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="ab54bea3-befb-4d86-a499-806f480df7b0" containerID="02bb3bb549597efd77133c161e80a44bd3a780a9d5af977cde1d4576e1416d68" exitCode=0 Nov 25 16:12:14 crc kubenswrapper[4743]: I1125 16:12:14.270923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" event={"ID":"ab54bea3-befb-4d86-a499-806f480df7b0","Type":"ContainerDied","Data":"02bb3bb549597efd77133c161e80a44bd3a780a9d5af977cde1d4576e1416d68"} Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.580432 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.663436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle\") pod \"ab54bea3-befb-4d86-a499-806f480df7b0\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.663558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util\") pod \"ab54bea3-befb-4d86-a499-806f480df7b0\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.663638 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mhnx\" (UniqueName: \"kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx\") pod \"ab54bea3-befb-4d86-a499-806f480df7b0\" (UID: \"ab54bea3-befb-4d86-a499-806f480df7b0\") " Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.664494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle" (OuterVolumeSpecName: "bundle") pod "ab54bea3-befb-4d86-a499-806f480df7b0" (UID: "ab54bea3-befb-4d86-a499-806f480df7b0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.668346 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx" (OuterVolumeSpecName: "kube-api-access-8mhnx") pod "ab54bea3-befb-4d86-a499-806f480df7b0" (UID: "ab54bea3-befb-4d86-a499-806f480df7b0"). InnerVolumeSpecName "kube-api-access-8mhnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.679457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util" (OuterVolumeSpecName: "util") pod "ab54bea3-befb-4d86-a499-806f480df7b0" (UID: "ab54bea3-befb-4d86-a499-806f480df7b0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.765551 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.765627 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab54bea3-befb-4d86-a499-806f480df7b0-util\") on node \"crc\" DevicePath \"\"" Nov 25 16:12:15 crc kubenswrapper[4743]: I1125 16:12:15.765646 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mhnx\" (UniqueName: \"kubernetes.io/projected/ab54bea3-befb-4d86-a499-806f480df7b0-kube-api-access-8mhnx\") on node \"crc\" DevicePath \"\"" Nov 25 16:12:16 crc kubenswrapper[4743]: I1125 16:12:16.289786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" event={"ID":"ab54bea3-befb-4d86-a499-806f480df7b0","Type":"ContainerDied","Data":"7196bfaaecbc5ceca22e58254027d8c657645d03dbb618cb2e380252b8668c2d"} Nov 25 16:12:16 crc kubenswrapper[4743]: I1125 16:12:16.289843 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7196bfaaecbc5ceca22e58254027d8c657645d03dbb618cb2e380252b8668c2d" Nov 25 16:12:16 crc kubenswrapper[4743]: I1125 16:12:16.289929 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.649405 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql"] Nov 25 16:12:18 crc kubenswrapper[4743]: E1125 16:12:18.649961 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="util" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.649976 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="util" Nov 25 16:12:18 crc kubenswrapper[4743]: E1125 16:12:18.649988 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="pull" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.649994 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="pull" Nov 25 16:12:18 crc kubenswrapper[4743]: E1125 16:12:18.650002 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="extract" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.650009 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="extract" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.650117 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab54bea3-befb-4d86-a499-806f480df7b0" containerName="extract" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.650532 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.657225 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-4jfnp" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.675645 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql"] Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.704208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzkc\" (UniqueName: \"kubernetes.io/projected/e289302a-7d4a-4b30-94fe-5babb338505d-kube-api-access-tdzkc\") pod \"openstack-operator-controller-operator-59fdcdbdd4-vrqql\" (UID: \"e289302a-7d4a-4b30-94fe-5babb338505d\") " pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.805370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzkc\" (UniqueName: \"kubernetes.io/projected/e289302a-7d4a-4b30-94fe-5babb338505d-kube-api-access-tdzkc\") pod \"openstack-operator-controller-operator-59fdcdbdd4-vrqql\" (UID: \"e289302a-7d4a-4b30-94fe-5babb338505d\") " pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.821764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzkc\" (UniqueName: \"kubernetes.io/projected/e289302a-7d4a-4b30-94fe-5babb338505d-kube-api-access-tdzkc\") pod \"openstack-operator-controller-operator-59fdcdbdd4-vrqql\" (UID: \"e289302a-7d4a-4b30-94fe-5babb338505d\") " pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:18 crc kubenswrapper[4743]: I1125 16:12:18.970309 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:19 crc kubenswrapper[4743]: I1125 16:12:19.369284 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql"] Nov 25 16:12:19 crc kubenswrapper[4743]: W1125 16:12:19.373543 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode289302a_7d4a_4b30_94fe_5babb338505d.slice/crio-4d030164a49d24e364b67b875e78bf1070159b7d22e7e9e15328a54945e29110 WatchSource:0}: Error finding container 4d030164a49d24e364b67b875e78bf1070159b7d22e7e9e15328a54945e29110: Status 404 returned error can't find the container with id 4d030164a49d24e364b67b875e78bf1070159b7d22e7e9e15328a54945e29110 Nov 25 16:12:20 crc kubenswrapper[4743]: I1125 16:12:20.312520 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" event={"ID":"e289302a-7d4a-4b30-94fe-5babb338505d","Type":"ContainerStarted","Data":"4d030164a49d24e364b67b875e78bf1070159b7d22e7e9e15328a54945e29110"} Nov 25 16:12:23 crc kubenswrapper[4743]: I1125 16:12:23.327889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" event={"ID":"e289302a-7d4a-4b30-94fe-5babb338505d","Type":"ContainerStarted","Data":"279970e27ef180c418c6166d09116ac1ad4bc3d6c68670fdbca59ca6fad2c147"} Nov 25 16:12:23 crc kubenswrapper[4743]: I1125 16:12:23.328475 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:12:23 crc kubenswrapper[4743]: I1125 16:12:23.366708 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" podStartSLOduration=2.228468827 podStartE2EDuration="5.366691295s" podCreationTimestamp="2025-11-25 16:12:18 +0000 UTC" firstStartedPulling="2025-11-25 16:12:19.375803008 +0000 UTC m=+818.497642557" lastFinishedPulling="2025-11-25 16:12:22.514025476 +0000 UTC m=+821.635865025" observedRunningTime="2025-11-25 16:12:23.359586643 +0000 UTC m=+822.481426212" watchObservedRunningTime="2025-11-25 16:12:23.366691295 +0000 UTC m=+822.488530844" Nov 25 16:12:28 crc kubenswrapper[4743]: I1125 16:12:28.973941 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-59fdcdbdd4-vrqql" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.471613 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.474799 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.478407 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6zwmh" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.480951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.481896 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.485826 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-t8kcm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.489799 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.512583 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.517736 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.518894 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.520384 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-c49l2" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.555071 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.563713 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.564877 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.568969 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c4q8q" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.576296 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.597734 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-prmnw"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.598770 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.601603 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2tzqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.615732 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-prmnw"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.617346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzcqp\" (UniqueName: \"kubernetes.io/projected/6a470e3c-9cac-463b-a253-308f3c386725-kube-api-access-jzcqp\") pod \"designate-operator-controller-manager-7d695c9b56-fwlrd\" (UID: \"6a470e3c-9cac-463b-a253-308f3c386725\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.617562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzxx\" (UniqueName: 
\"kubernetes.io/projected/0729dc1e-3e2c-410e-892d-ef4773882665-kube-api-access-mgzxx\") pod \"cinder-operator-controller-manager-79856dc55c-wk72z\" (UID: \"0729dc1e-3e2c-410e-892d-ef4773882665\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.620684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bz4q\" (UniqueName: \"kubernetes.io/projected/9b77ce44-3830-488e-ac40-97af4d969f6e-kube-api-access-4bz4q\") pod \"barbican-operator-controller-manager-86dc4d89c8-82tqm\" (UID: \"9b77ce44-3830-488e-ac40-97af4d969f6e\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.632197 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.634517 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.635451 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.636441 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.639161 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qhgmb" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.639317 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jpxfn" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.641298 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.644138 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.650434 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.650443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bksg2" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.650906 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.668233 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.672171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.707272 4743 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.708214 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.712091 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.713116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.713543 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rgltt" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.719322 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jqpg8" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.719624 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.721469 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzxx\" (UniqueName: \"kubernetes.io/projected/0729dc1e-3e2c-410e-892d-ef4773882665-kube-api-access-mgzxx\") pod \"cinder-operator-controller-manager-79856dc55c-wk72z\" (UID: \"0729dc1e-3e2c-410e-892d-ef4773882665\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqfb9\" (UniqueName: \"kubernetes.io/projected/288e97c2-c236-4177-9a52-bcf1c6c69faa-kube-api-access-mqfb9\") pod \"glance-operator-controller-manager-68b95954c9-q8p2b\" (UID: \"288e97c2-c236-4177-9a52-bcf1c6c69faa\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7775r\" (UniqueName: \"kubernetes.io/projected/38527a3c-d051-4354-a8ca-0692153762f1-kube-api-access-7775r\") pod \"ironic-operator-controller-manager-5bfcdc958c-f7g4h\" (UID: \"38527a3c-d051-4354-a8ca-0692153762f1\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722202 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hns6g\" (UniqueName: \"kubernetes.io/projected/aebddcf8-77ce-4317-94c3-f29b45f93686-kube-api-access-hns6g\") pod \"horizon-operator-controller-manager-68c9694994-zq7mp\" (UID: \"aebddcf8-77ce-4317-94c3-f29b45f93686\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" 
Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bz4q\" (UniqueName: \"kubernetes.io/projected/9b77ce44-3830-488e-ac40-97af4d969f6e-kube-api-access-4bz4q\") pod \"barbican-operator-controller-manager-86dc4d89c8-82tqm\" (UID: \"9b77ce44-3830-488e-ac40-97af4d969f6e\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54r2w\" (UniqueName: \"kubernetes.io/projected/29690625-5e1d-417a-b0e5-9d74645b31f7-kube-api-access-54r2w\") pod \"heat-operator-controller-manager-774b86978c-prmnw\" (UID: \"29690625-5e1d-417a-b0e5-9d74645b31f7\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.722343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzcqp\" (UniqueName: \"kubernetes.io/projected/6a470e3c-9cac-463b-a253-308f3c386725-kube-api-access-jzcqp\") pod \"designate-operator-controller-manager-7d695c9b56-fwlrd\" (UID: \"6a470e3c-9cac-463b-a253-308f3c386725\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.726479 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-f5q7t" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.732629 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.750725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bz4q\" (UniqueName: 
\"kubernetes.io/projected/9b77ce44-3830-488e-ac40-97af4d969f6e-kube-api-access-4bz4q\") pod \"barbican-operator-controller-manager-86dc4d89c8-82tqm\" (UID: \"9b77ce44-3830-488e-ac40-97af4d969f6e\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.754355 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.765662 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.765726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.766657 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.768414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzxx\" (UniqueName: \"kubernetes.io/projected/0729dc1e-3e2c-410e-892d-ef4773882665-kube-api-access-mgzxx\") pod \"cinder-operator-controller-manager-79856dc55c-wk72z\" (UID: \"0729dc1e-3e2c-410e-892d-ef4773882665\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.771078 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-kjfwh" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.772299 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 
16:13:03.775344 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.782951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzcqp\" (UniqueName: \"kubernetes.io/projected/6a470e3c-9cac-463b-a253-308f3c386725-kube-api-access-jzcqp\") pod \"designate-operator-controller-manager-7d695c9b56-fwlrd\" (UID: \"6a470e3c-9cac-463b-a253-308f3c386725\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.783545 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k9tvb" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.803000 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.813490 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.818570 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.824110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4trsz\" (UniqueName: \"kubernetes.io/projected/bc109d32-7111-40b6-aff6-7596c933114f-kube-api-access-4trsz\") pod \"keystone-operator-controller-manager-748dc6576f-z5rhv\" (UID: \"bc109d32-7111-40b6-aff6-7596c933114f\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.825748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqfb9\" (UniqueName: \"kubernetes.io/projected/288e97c2-c236-4177-9a52-bcf1c6c69faa-kube-api-access-mqfb9\") pod \"glance-operator-controller-manager-68b95954c9-q8p2b\" (UID: \"288e97c2-c236-4177-9a52-bcf1c6c69faa\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.825864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc22p\" (UniqueName: \"kubernetes.io/projected/9f422105-6959-44e5-93e2-901fd9b84dfc-kube-api-access-sc22p\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-djxpp\" (UID: \"9f422105-6959-44e5-93e2-901fd9b84dfc\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.825942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7775r\" (UniqueName: \"kubernetes.io/projected/38527a3c-d051-4354-a8ca-0692153762f1-kube-api-access-7775r\") pod 
\"ironic-operator-controller-manager-5bfcdc958c-f7g4h\" (UID: \"38527a3c-d051-4354-a8ca-0692153762f1\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.826023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hns6g\" (UniqueName: \"kubernetes.io/projected/aebddcf8-77ce-4317-94c3-f29b45f93686-kube-api-access-hns6g\") pod \"horizon-operator-controller-manager-68c9694994-zq7mp\" (UID: \"aebddcf8-77ce-4317-94c3-f29b45f93686\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.826119 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54r2w\" (UniqueName: \"kubernetes.io/projected/29690625-5e1d-417a-b0e5-9d74645b31f7-kube-api-access-54r2w\") pod \"heat-operator-controller-manager-774b86978c-prmnw\" (UID: \"29690625-5e1d-417a-b0e5-9d74645b31f7\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.826192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4tv\" (UniqueName: \"kubernetes.io/projected/9caca3f1-e43f-47ab-aa8a-1248a30cfda4-kube-api-access-7z4tv\") pod \"manila-operator-controller-manager-58bb8d67cc-8s47h\" (UID: \"9caca3f1-e43f-47ab-aa8a-1248a30cfda4\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.826266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0cd465a-f903-48ef-aca1-839a390d3f12-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " 
pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.826365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6zqp\" (UniqueName: \"kubernetes.io/projected/d0cd465a-f903-48ef-aca1-839a390d3f12-kube-api-access-z6zqp\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.827489 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.840959 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.853576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54r2w\" (UniqueName: \"kubernetes.io/projected/29690625-5e1d-417a-b0e5-9d74645b31f7-kube-api-access-54r2w\") pod \"heat-operator-controller-manager-774b86978c-prmnw\" (UID: \"29690625-5e1d-417a-b0e5-9d74645b31f7\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.862944 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.864159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqfb9\" (UniqueName: \"kubernetes.io/projected/288e97c2-c236-4177-9a52-bcf1c6c69faa-kube-api-access-mqfb9\") pod \"glance-operator-controller-manager-68b95954c9-q8p2b\" (UID: \"288e97c2-c236-4177-9a52-bcf1c6c69faa\") " 
pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.864164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.868046 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hns6g\" (UniqueName: \"kubernetes.io/projected/aebddcf8-77ce-4317-94c3-f29b45f93686-kube-api-access-hns6g\") pod \"horizon-operator-controller-manager-68c9694994-zq7mp\" (UID: \"aebddcf8-77ce-4317-94c3-f29b45f93686\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.879231 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-298qq" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.885682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.892498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7775r\" (UniqueName: \"kubernetes.io/projected/38527a3c-d051-4354-a8ca-0692153762f1-kube-api-access-7775r\") pod \"ironic-operator-controller-manager-5bfcdc958c-f7g4h\" (UID: \"38527a3c-d051-4354-a8ca-0692153762f1\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.892689 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6zqp\" (UniqueName: \"kubernetes.io/projected/d0cd465a-f903-48ef-aca1-839a390d3f12-kube-api-access-z6zqp\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncvts\" (UniqueName: \"kubernetes.io/projected/8d418847-cf8f-4977-bc14-3d4b64591e68-kube-api-access-ncvts\") pod \"neutron-operator-controller-manager-7c57c8bbc4-j9zq7\" (UID: \"8d418847-cf8f-4977-bc14-3d4b64591e68\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4trsz\" (UniqueName: \"kubernetes.io/projected/bc109d32-7111-40b6-aff6-7596c933114f-kube-api-access-4trsz\") pod \"keystone-operator-controller-manager-748dc6576f-z5rhv\" (UID: \"bc109d32-7111-40b6-aff6-7596c933114f\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rftx\" (UniqueName: \"kubernetes.io/projected/88757539-b3d4-4de5-bc96-a4cd13d5a203-kube-api-access-7rftx\") pod \"nova-operator-controller-manager-79556f57fc-bxdjg\" (UID: \"88757539-b3d4-4de5-bc96-a4cd13d5a203\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" Nov 25 16:13:03 crc 
kubenswrapper[4743]: I1125 16:13:03.927434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc22p\" (UniqueName: \"kubernetes.io/projected/9f422105-6959-44e5-93e2-901fd9b84dfc-kube-api-access-sc22p\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-djxpp\" (UID: \"9f422105-6959-44e5-93e2-901fd9b84dfc\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4tv\" (UniqueName: \"kubernetes.io/projected/9caca3f1-e43f-47ab-aa8a-1248a30cfda4-kube-api-access-7z4tv\") pod \"manila-operator-controller-manager-58bb8d67cc-8s47h\" (UID: \"9caca3f1-e43f-47ab-aa8a-1248a30cfda4\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927503 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0cd465a-f903-48ef-aca1-839a390d3f12-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.927948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.928874 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.931987 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.934272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0cd465a-f903-48ef-aca1-839a390d3f12-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.939053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.949501 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2qlz2" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.960535 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.965729 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.970166 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4trsz\" (UniqueName: \"kubernetes.io/projected/bc109d32-7111-40b6-aff6-7596c933114f-kube-api-access-4trsz\") pod \"keystone-operator-controller-manager-748dc6576f-z5rhv\" (UID: \"bc109d32-7111-40b6-aff6-7596c933114f\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.971104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6zqp\" (UniqueName: \"kubernetes.io/projected/d0cd465a-f903-48ef-aca1-839a390d3f12-kube-api-access-z6zqp\") pod \"infra-operator-controller-manager-d5cc86f4b-zdjj7\" (UID: \"d0cd465a-f903-48ef-aca1-839a390d3f12\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.975656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4tv\" (UniqueName: \"kubernetes.io/projected/9caca3f1-e43f-47ab-aa8a-1248a30cfda4-kube-api-access-7z4tv\") pod \"manila-operator-controller-manager-58bb8d67cc-8s47h\" (UID: \"9caca3f1-e43f-47ab-aa8a-1248a30cfda4\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.976154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc22p\" (UniqueName: \"kubernetes.io/projected/9f422105-6959-44e5-93e2-901fd9b84dfc-kube-api-access-sc22p\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-djxpp\" (UID: 
\"9f422105-6959-44e5-93e2-901fd9b84dfc\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.984811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.984893 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.985982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.987282 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8"] Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.988210 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.988720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-v7cdl" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.991148 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2696m" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.998231 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:03 crc kubenswrapper[4743]: I1125 16:13:03.999122 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.009536 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.012703 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.017231 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bbx6j" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.017445 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.033131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncvts\" (UniqueName: \"kubernetes.io/projected/8d418847-cf8f-4977-bc14-3d4b64591e68-kube-api-access-ncvts\") pod \"neutron-operator-controller-manager-7c57c8bbc4-j9zq7\" (UID: \"8d418847-cf8f-4977-bc14-3d4b64591e68\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.033196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmrg\" (UniqueName: \"kubernetes.io/projected/94418bc2-d439-451f-91c2-c457a200825e-kube-api-access-qfmrg\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.033232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.033286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rftx\" (UniqueName: \"kubernetes.io/projected/88757539-b3d4-4de5-bc96-a4cd13d5a203-kube-api-access-7rftx\") pod \"nova-operator-controller-manager-79556f57fc-bxdjg\" (UID: \"88757539-b3d4-4de5-bc96-a4cd13d5a203\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.033350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvhz\" (UniqueName: \"kubernetes.io/projected/44e2f27d-a5d4-48cf-90f5-2f5598a2295a-kube-api-access-4cvhz\") pod \"octavia-operator-controller-manager-fd75fd47d-bvwmw\" (UID: \"44e2f27d-a5d4-48cf-90f5-2f5598a2295a\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.040936 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.059693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.070533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncvts\" (UniqueName: \"kubernetes.io/projected/8d418847-cf8f-4977-bc14-3d4b64591e68-kube-api-access-ncvts\") pod \"neutron-operator-controller-manager-7c57c8bbc4-j9zq7\" (UID: \"8d418847-cf8f-4977-bc14-3d4b64591e68\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.074043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rftx\" (UniqueName: \"kubernetes.io/projected/88757539-b3d4-4de5-bc96-a4cd13d5a203-kube-api-access-7rftx\") pod \"nova-operator-controller-manager-79556f57fc-bxdjg\" (UID: \"88757539-b3d4-4de5-bc96-a4cd13d5a203\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.082993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.084341 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.105992 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-v2c2l" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.112987 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.113219 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrm5j\" (UniqueName: \"kubernetes.io/projected/e2417720-74c0-4232-9f99-cdc10e485c91-kube-api-access-hrm5j\") pod \"swift-operator-controller-manager-6fdc4fcf86-smsbr\" (UID: \"e2417720-74c0-4232-9f99-cdc10e485c91\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvhz\" (UniqueName: \"kubernetes.io/projected/44e2f27d-a5d4-48cf-90f5-2f5598a2295a-kube-api-access-4cvhz\") pod \"octavia-operator-controller-manager-fd75fd47d-bvwmw\" (UID: 
\"44e2f27d-a5d4-48cf-90f5-2f5598a2295a\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mtc\" (UniqueName: \"kubernetes.io/projected/7be9f6fc-3582-4e14-a452-daa24035d10e-kube-api-access-s4mtc\") pod \"placement-operator-controller-manager-5db546f9d9-g5bp8\" (UID: \"7be9f6fc-3582-4e14-a452-daa24035d10e\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qww\" (UniqueName: \"kubernetes.io/projected/e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf-kube-api-access-56qww\") pod \"ovn-operator-controller-manager-66cf5c67ff-mwm9p\" (UID: \"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.138651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmrg\" (UniqueName: \"kubernetes.io/projected/94418bc2-d439-451f-91c2-c457a200825e-kube-api-access-qfmrg\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.139006 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.139043 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert 
podName:94418bc2-d439-451f-91c2-c457a200825e nodeName:}" failed. No retries permitted until 2025-11-25 16:13:04.639028127 +0000 UTC m=+863.760867676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert") pod "openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" (UID: "94418bc2-d439-451f-91c2-c457a200825e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.151392 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.164110 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.166704 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-62n45" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.173189 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmrg\" (UniqueName: \"kubernetes.io/projected/94418bc2-d439-451f-91c2-c457a200825e-kube-api-access-qfmrg\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.179265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvhz\" (UniqueName: \"kubernetes.io/projected/44e2f27d-a5d4-48cf-90f5-2f5598a2295a-kube-api-access-4cvhz\") pod \"octavia-operator-controller-manager-fd75fd47d-bvwmw\" (UID: \"44e2f27d-a5d4-48cf-90f5-2f5598a2295a\") " 
pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.179385 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.200418 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.208768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.225354 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrm5j\" (UniqueName: \"kubernetes.io/projected/e2417720-74c0-4232-9f99-cdc10e485c91-kube-api-access-hrm5j\") pod \"swift-operator-controller-manager-6fdc4fcf86-smsbr\" (UID: \"e2417720-74c0-4232-9f99-cdc10e485c91\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mtc\" (UniqueName: \"kubernetes.io/projected/7be9f6fc-3582-4e14-a452-daa24035d10e-kube-api-access-s4mtc\") pod \"placement-operator-controller-manager-5db546f9d9-g5bp8\" (UID: \"7be9f6fc-3582-4e14-a452-daa24035d10e\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251327 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-lc6v9\" (UniqueName: \"kubernetes.io/projected/05a605e3-814f-45a4-8461-47cbb3330652-kube-api-access-lc6v9\") pod \"test-operator-controller-manager-5cb74df96-xbxjh\" (UID: \"05a605e3-814f-45a4-8461-47cbb3330652\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qww\" (UniqueName: \"kubernetes.io/projected/e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf-kube-api-access-56qww\") pod \"ovn-operator-controller-manager-66cf5c67ff-mwm9p\" (UID: \"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251375 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whwm4\" (UniqueName: \"kubernetes.io/projected/d4e33a37-ac1e-408c-b0d3-a1352daa67af-kube-api-access-whwm4\") pod \"telemetry-operator-controller-manager-567f98c9d-8lsj8\" (UID: \"d4e33a37-ac1e-408c-b0d3-a1352daa67af\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.251514 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.260918 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-pjmtx"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.262434 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.266803 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-sff6l" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.281675 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-pjmtx"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.290496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrm5j\" (UniqueName: \"kubernetes.io/projected/e2417720-74c0-4232-9f99-cdc10e485c91-kube-api-access-hrm5j\") pod \"swift-operator-controller-manager-6fdc4fcf86-smsbr\" (UID: \"e2417720-74c0-4232-9f99-cdc10e485c91\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.294243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mtc\" (UniqueName: \"kubernetes.io/projected/7be9f6fc-3582-4e14-a452-daa24035d10e-kube-api-access-s4mtc\") pod \"placement-operator-controller-manager-5db546f9d9-g5bp8\" (UID: \"7be9f6fc-3582-4e14-a452-daa24035d10e\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.295535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qww\" (UniqueName: \"kubernetes.io/projected/e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf-kube-api-access-56qww\") pod \"ovn-operator-controller-manager-66cf5c67ff-mwm9p\" (UID: \"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.305772 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.306989 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.311336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.312496 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.312703 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.312868 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vlg2l" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.313725 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.332789 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.336662 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.349708 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.351585 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7"] Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.351673 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.352077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6v9\" (UniqueName: \"kubernetes.io/projected/05a605e3-814f-45a4-8461-47cbb3330652-kube-api-access-lc6v9\") pod \"test-operator-controller-manager-5cb74df96-xbxjh\" (UID: \"05a605e3-814f-45a4-8461-47cbb3330652\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.352122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whwm4\" (UniqueName: \"kubernetes.io/projected/d4e33a37-ac1e-408c-b0d3-a1352daa67af-kube-api-access-whwm4\") pod \"telemetry-operator-controller-manager-567f98c9d-8lsj8\" (UID: \"d4e33a37-ac1e-408c-b0d3-a1352daa67af\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.352190 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqdn\" (UniqueName: \"kubernetes.io/projected/5a86bde8-f04d-4bfa-842f-6c960d7232fb-kube-api-access-hzqdn\") pod \"watcher-operator-controller-manager-864885998-pjmtx\" (UID: 
\"5a86bde8-f04d-4bfa-842f-6c960d7232fb\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.353564 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6n8mb" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.369547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6v9\" (UniqueName: \"kubernetes.io/projected/05a605e3-814f-45a4-8461-47cbb3330652-kube-api-access-lc6v9\") pod \"test-operator-controller-manager-5cb74df96-xbxjh\" (UID: \"05a605e3-814f-45a4-8461-47cbb3330652\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.376427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whwm4\" (UniqueName: \"kubernetes.io/projected/d4e33a37-ac1e-408c-b0d3-a1352daa67af-kube-api-access-whwm4\") pod \"telemetry-operator-controller-manager-567f98c9d-8lsj8\" (UID: \"d4e33a37-ac1e-408c-b0d3-a1352daa67af\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.381169 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.410222 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.455221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr54v\" (UniqueName: \"kubernetes.io/projected/20c829b2-be6f-4f96-85c1-21279d871c99-kube-api-access-cr54v\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.455608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqdn\" (UniqueName: \"kubernetes.io/projected/5a86bde8-f04d-4bfa-842f-6c960d7232fb-kube-api-access-hzqdn\") pod \"watcher-operator-controller-manager-864885998-pjmtx\" (UID: \"5a86bde8-f04d-4bfa-842f-6c960d7232fb\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.455635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.455698 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjh6\" (UniqueName: \"kubernetes.io/projected/66ead12f-65d1-4438-80b0-1a747105d7fc-kube-api-access-hkjh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g4sq7\" (UID: \"66ead12f-65d1-4438-80b0-1a747105d7fc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" Nov 25 16:13:04 
crc kubenswrapper[4743]: I1125 16:13:04.455715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.493154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqdn\" (UniqueName: \"kubernetes.io/projected/5a86bde8-f04d-4bfa-842f-6c960d7232fb-kube-api-access-hzqdn\") pod \"watcher-operator-controller-manager-864885998-pjmtx\" (UID: \"5a86bde8-f04d-4bfa-842f-6c960d7232fb\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.561573 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjh6\" (UniqueName: \"kubernetes.io/projected/66ead12f-65d1-4438-80b0-1a747105d7fc-kube-api-access-hkjh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g4sq7\" (UID: \"66ead12f-65d1-4438-80b0-1a747105d7fc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.561698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.561757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr54v\" (UniqueName: 
\"kubernetes.io/projected/20c829b2-be6f-4f96-85c1-21279d871c99-kube-api-access-cr54v\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.561854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.562074 4743 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.562158 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs podName:20c829b2-be6f-4f96-85c1-21279d871c99 nodeName:}" failed. No retries permitted until 2025-11-25 16:13:05.062117172 +0000 UTC m=+864.183956721 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs") pod "openstack-operator-controller-manager-746c9d5b4f-z2hm7" (UID: "20c829b2-be6f-4f96-85c1-21279d871c99") : secret "metrics-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.562744 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.562829 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs podName:20c829b2-be6f-4f96-85c1-21279d871c99 nodeName:}" failed. No retries permitted until 2025-11-25 16:13:05.062808554 +0000 UTC m=+864.184648163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs") pod "openstack-operator-controller-manager-746c9d5b4f-z2hm7" (UID: "20c829b2-be6f-4f96-85c1-21279d871c99") : secret "webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.596228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjh6\" (UniqueName: \"kubernetes.io/projected/66ead12f-65d1-4438-80b0-1a747105d7fc-kube-api-access-hkjh6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g4sq7\" (UID: \"66ead12f-65d1-4438-80b0-1a747105d7fc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.596309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr54v\" (UniqueName: \"kubernetes.io/projected/20c829b2-be6f-4f96-85c1-21279d871c99-kube-api-access-cr54v\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " 
pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.664979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.665235 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: E1125 16:13:04.665369 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert podName:94418bc2-d439-451f-91c2-c457a200825e nodeName:}" failed. No retries permitted until 2025-11-25 16:13:05.665341387 +0000 UTC m=+864.787180976 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert") pod "openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" (UID: "94418bc2-d439-451f-91c2-c457a200825e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.756832 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.829959 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" Nov 25 16:13:04 crc kubenswrapper[4743]: I1125 16:13:04.978767 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.015325 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b"] Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.019458 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod288e97c2_c236_4177_9a52_bcf1c6c69faa.slice/crio-23596bab9c9c01cfe12091d3c8ba2cddec13cb349e50c1aa110ae25301467079 WatchSource:0}: Error finding container 23596bab9c9c01cfe12091d3c8ba2cddec13cb349e50c1aa110ae25301467079: Status 404 returned error can't find the container with id 23596bab9c9c01cfe12091d3c8ba2cddec13cb349e50c1aa110ae25301467079 Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.026643 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a470e3c_9cac_463b_a253_308f3c386725.slice/crio-69b7ec915d9e7786ddacc21523eeeba0aec5f6e6dc260f87a589c5c0c9b8f302 WatchSource:0}: Error finding container 69b7ec915d9e7786ddacc21523eeeba0aec5f6e6dc260f87a589c5c0c9b8f302: Status 404 returned error can't find the container with id 69b7ec915d9e7786ddacc21523eeeba0aec5f6e6dc260f87a589c5c0c9b8f302 Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.030492 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.072561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.072668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.072900 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.072960 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs podName:20c829b2-be6f-4f96-85c1-21279d871c99 nodeName:}" failed. No retries permitted until 2025-11-25 16:13:06.072943088 +0000 UTC m=+865.194782637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs") pod "openstack-operator-controller-manager-746c9d5b4f-z2hm7" (UID: "20c829b2-be6f-4f96-85c1-21279d871c99") : secret "webhook-server-cert" not found Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.082657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-metrics-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.145470 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.153267 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv"] Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.158474 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc109d32_7111_40b6_aff6_7596c933114f.slice/crio-2d9a0f9bffa3f21bc54b572854744830aa0bc80e13f10ca6bc5b988e560e3200 WatchSource:0}: Error finding container 2d9a0f9bffa3f21bc54b572854744830aa0bc80e13f10ca6bc5b988e560e3200: Status 404 returned error can't find the container with id 2d9a0f9bffa3f21bc54b572854744830aa0bc80e13f10ca6bc5b988e560e3200 Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.172185 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp"] Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.174975 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38527a3c_d051_4354_a8ca_0692153762f1.slice/crio-7e77d29e73e5021a2f8efc0dc7b1f3b5a69587c249bddd3643ab0149a2f9f4ae WatchSource:0}: Error finding container 7e77d29e73e5021a2f8efc0dc7b1f3b5a69587c249bddd3643ab0149a2f9f4ae: Status 404 returned error can't find the container with id 7e77d29e73e5021a2f8efc0dc7b1f3b5a69587c249bddd3643ab0149a2f9f4ae Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.176579 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29690625_5e1d_417a_b0e5_9d74645b31f7.slice/crio-539af94b59965eef233c275f5603868048ae53cc024335b35f3ad9ea1a7d2186 WatchSource:0}: Error finding container 539af94b59965eef233c275f5603868048ae53cc024335b35f3ad9ea1a7d2186: Status 404 returned error can't find the container with id 539af94b59965eef233c275f5603868048ae53cc024335b35f3ad9ea1a7d2186 Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.178873 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h"] Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.179586 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e2f27d_a5d4_48cf_90f5_2f5598a2295a.slice/crio-7a4922aae657c83d7419ec32dbf8d16bb39e7ae8d168f9d24927b229e8da04c8 WatchSource:0}: Error finding container 7a4922aae657c83d7419ec32dbf8d16bb39e7ae8d168f9d24927b229e8da04c8: Status 404 returned error can't find the container with id 7a4922aae657c83d7419ec32dbf8d16bb39e7ae8d168f9d24927b229e8da04c8 Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.186168 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-prmnw"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.192125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.540696 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.553255 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.559736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.566611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.571410 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.579297 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.592398 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg"] Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.596766 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f422105_6959_44e5_93e2_901fd9b84dfc.slice/crio-bdb3e714d946b35fcc25642940b1289987767ad4f91a76e56b5ae23f74fb3fe4 WatchSource:0}: Error finding container bdb3e714d946b35fcc25642940b1289987767ad4f91a76e56b5ae23f74fb3fe4: Status 404 returned error can't find the container with id bdb3e714d946b35fcc25642940b1289987767ad4f91a76e56b5ae23f74fb3fe4 Nov 25 
16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.597235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.597293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" event={"ID":"8d418847-cf8f-4977-bc14-3d4b64591e68","Type":"ContainerStarted","Data":"1361babda5049ab5e224fa13e24be31a6c2644a4ad919b79036646ee6f0e3fae"} Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.600664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" event={"ID":"9b77ce44-3830-488e-ac40-97af4d969f6e","Type":"ContainerStarted","Data":"6ed3f5c92ec398be1242328a2fdbcf3e76d752d4104e7976ae375a49ad02a966"} Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.601927 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" event={"ID":"aebddcf8-77ce-4317-94c3-f29b45f93686","Type":"ContainerStarted","Data":"ee2ed971da0a92c58a911bfc859ca2a822d764eedc64785f86a2a009f2837dca"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.604347 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: 
{{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrm5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-smsbr_openstack-operators(e2417720-74c0-4232-9f99-cdc10e485c91): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.604722 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8"] Nov 25 
16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.610547 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrm5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-smsbr_openstack-operators(e2417720-74c0-4232-9f99-cdc10e485c91): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.610902 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh"] Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.611626 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" event={"ID":"bc109d32-7111-40b6-aff6-7596c933114f","Type":"ContainerStarted","Data":"2d9a0f9bffa3f21bc54b572854744830aa0bc80e13f10ca6bc5b988e560e3200"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.611850 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" podUID="e2417720-74c0-4232-9f99-cdc10e485c91" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.612322 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7z4tv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58bb8d67cc-8s47h_openstack-operators(9caca3f1-e43f-47ab-aa8a-1248a30cfda4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.613689 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4mtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-g5bp8_openstack-operators(7be9f6fc-3582-4e14-a452-daa24035d10e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.613831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" event={"ID":"0729dc1e-3e2c-410e-892d-ef4773882665","Type":"ContainerStarted","Data":"85f4add957cb3a4032ababf97d35ff7b1ad9898c5fa19c8f85ee774a29648cbe"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.614345 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7z4tv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58bb8d67cc-8s47h_openstack-operators(9caca3f1-e43f-47ab-aa8a-1248a30cfda4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.615818 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" podUID="9caca3f1-e43f-47ab-aa8a-1248a30cfda4" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.618057 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8"] Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.618826 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ 
--logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4mtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-g5bp8_openstack-operators(7be9f6fc-3582-4e14-a452-daa24035d10e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.618862 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88757539_b3d4_4de5_bc96_a4cd13d5a203.slice/crio-2997316950df81c7ee82c300c51a9427c7afab706fe83ef39229a1aa132e0e4d WatchSource:0}: Error finding container 2997316950df81c7ee82c300c51a9427c7afab706fe83ef39229a1aa132e0e4d: Status 404 returned error can't find the container with id 2997316950df81c7ee82c300c51a9427c7afab706fe83ef39229a1aa132e0e4d Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 
16:13:05.618874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" event={"ID":"44e2f27d-a5d4-48cf-90f5-2f5598a2295a","Type":"ContainerStarted","Data":"7a4922aae657c83d7419ec32dbf8d16bb39e7ae8d168f9d24927b229e8da04c8"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.620540 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" podUID="7be9f6fc-3582-4e14-a452-daa24035d10e" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.620985 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whwm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-8lsj8_openstack-operators(d4e33a37-ac1e-408c-b0d3-a1352daa67af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.622368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" event={"ID":"38527a3c-d051-4354-a8ca-0692153762f1","Type":"ContainerStarted","Data":"7e77d29e73e5021a2f8efc0dc7b1f3b5a69587c249bddd3643ab0149a2f9f4ae"} Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.623623 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-pjmtx"] Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.625250 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rftx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-bxdjg_openstack-operators(88757539-b3d4-4de5-bc96-a4cd13d5a203): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.625500 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ead12f_65d1_4438_80b0_1a747105d7fc.slice/crio-4b1443fdf5321bbb7ff98e09a094576ea07dbdbe0ea359d30f5fc63e632f62d2 WatchSource:0}: Error finding container 4b1443fdf5321bbb7ff98e09a094576ea07dbdbe0ea359d30f5fc63e632f62d2: Status 404 returned error can't find the container with id 4b1443fdf5321bbb7ff98e09a094576ea07dbdbe0ea359d30f5fc63e632f62d2 Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.627266 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" event={"ID":"29690625-5e1d-417a-b0e5-9d74645b31f7","Type":"ContainerStarted","Data":"539af94b59965eef233c275f5603868048ae53cc024335b35f3ad9ea1a7d2186"} Nov 25 16:13:05 crc kubenswrapper[4743]: W1125 16:13:05.628738 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a86bde8_f04d_4bfa_842f_6c960d7232fb.slice/crio-43c9222432737e0be9d5d827b59e90a0fdba0ac4a23037c87e7bfc09a199e32d WatchSource:0}: Error finding container 43c9222432737e0be9d5d827b59e90a0fdba0ac4a23037c87e7bfc09a199e32d: Status 404 returned error can't find the container with id 43c9222432737e0be9d5d827b59e90a0fdba0ac4a23037c87e7bfc09a199e32d Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.630283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" event={"ID":"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf","Type":"ContainerStarted","Data":"58cfb2f6c65a458b7118c86c596405a9fe3a6e2b52c0554c770fd24405eb9dfc"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.632269 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc6v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-xbxjh_openstack-operators(05a605e3-814f-45a4-8461-47cbb3330652): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.632478 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzqdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-pjmtx_openstack-operators(5a86bde8-f04d-4bfa-842f-6c960d7232fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.632692 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whwm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-8lsj8_openstack-operators(d4e33a37-ac1e-408c-b0d3-a1352daa67af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.632873 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rftx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-bxdjg_openstack-operators(88757539-b3d4-4de5-bc96-a4cd13d5a203): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.633249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" event={"ID":"6a470e3c-9cac-463b-a253-308f3c386725","Type":"ContainerStarted","Data":"69b7ec915d9e7786ddacc21523eeeba0aec5f6e6dc260f87a589c5c0c9b8f302"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.633247 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hkjh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-g4sq7_openstack-operators(66ead12f-65d1-4438-80b0-1a747105d7fc): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.633773 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" podUID="d4e33a37-ac1e-408c-b0d3-a1352daa67af" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.634483 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" podUID="88757539-b3d4-4de5-bc96-a4cd13d5a203" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.634519 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" podUID="66ead12f-65d1-4438-80b0-1a747105d7fc" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.635140 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc6v9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-xbxjh_openstack-operators(05a605e3-814f-45a4-8461-47cbb3330652): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.635246 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzqdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-pjmtx_openstack-operators(5a86bde8-f04d-4bfa-842f-6c960d7232fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.635347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" event={"ID":"288e97c2-c236-4177-9a52-bcf1c6c69faa","Type":"ContainerStarted","Data":"23596bab9c9c01cfe12091d3c8ba2cddec13cb349e50c1aa110ae25301467079"} Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.636232 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" podUID="05a605e3-814f-45a4-8461-47cbb3330652" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.637314 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" podUID="5a86bde8-f04d-4bfa-842f-6c960d7232fb" Nov 25 16:13:05 crc kubenswrapper[4743]: I1125 16:13:05.680768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.681156 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:05 crc kubenswrapper[4743]: E1125 16:13:05.681247 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert podName:94418bc2-d439-451f-91c2-c457a200825e nodeName:}" failed. No retries permitted until 2025-11-25 16:13:07.681224417 +0000 UTC m=+866.803063966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert") pod "openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" (UID: "94418bc2-d439-451f-91c2-c457a200825e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.090950 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.091020 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.091085 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs podName:20c829b2-be6f-4f96-85c1-21279d871c99 nodeName:}" failed. No retries permitted until 2025-11-25 16:13:08.091068917 +0000 UTC m=+867.212908466 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs") pod "openstack-operator-controller-manager-746c9d5b4f-z2hm7" (UID: "20c829b2-be6f-4f96-85c1-21279d871c99") : secret "webhook-server-cert" not found Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.761775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" event={"ID":"e2417720-74c0-4232-9f99-cdc10e485c91","Type":"ContainerStarted","Data":"cf58664e2031c36e24a77518cc95ceb604d24902a63b174da3f505053d2a88d4"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.772621 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" podUID="e2417720-74c0-4232-9f99-cdc10e485c91" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.775214 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" event={"ID":"d0cd465a-f903-48ef-aca1-839a390d3f12","Type":"ContainerStarted","Data":"f76c3af1b5168c32ab948b03ddd34dd0f9a4eff19b5bf9423af5879c16dc76e5"} Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.776568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" event={"ID":"d4e33a37-ac1e-408c-b0d3-a1352daa67af","Type":"ContainerStarted","Data":"bf614787191eb370d8af61ee69939ef030e8aa9fed082829312fe9ad390dc179"} Nov 25 16:13:06 crc 
kubenswrapper[4743]: E1125 16:13:06.779181 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" podUID="d4e33a37-ac1e-408c-b0d3-a1352daa67af" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.784994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" event={"ID":"66ead12f-65d1-4438-80b0-1a747105d7fc","Type":"ContainerStarted","Data":"4b1443fdf5321bbb7ff98e09a094576ea07dbdbe0ea359d30f5fc63e632f62d2"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.789349 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" podUID="66ead12f-65d1-4438-80b0-1a747105d7fc" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.790320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" event={"ID":"9caca3f1-e43f-47ab-aa8a-1248a30cfda4","Type":"ContainerStarted","Data":"d3caf004b41c5f262088e1d219dd5497ba235bc4b17307f61cfd945f2d309382"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.796417 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" podUID="9caca3f1-e43f-47ab-aa8a-1248a30cfda4" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.796634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" event={"ID":"7be9f6fc-3582-4e14-a452-daa24035d10e","Type":"ContainerStarted","Data":"045362023d7941c0ec098a22098b848d08cf9f62d23713c72596e039ea540b7c"} Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.798512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" event={"ID":"9f422105-6959-44e5-93e2-901fd9b84dfc","Type":"ContainerStarted","Data":"bdb3e714d946b35fcc25642940b1289987767ad4f91a76e56b5ae23f74fb3fe4"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.798862 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" podUID="7be9f6fc-3582-4e14-a452-daa24035d10e" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.799580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" 
event={"ID":"05a605e3-814f-45a4-8461-47cbb3330652","Type":"ContainerStarted","Data":"361d62ac1e17e29df2e186ae91d77d63b10b9daa9ce1c495acd1d89c1e404f0a"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.819148 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" podUID="05a605e3-814f-45a4-8461-47cbb3330652" Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.823435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" event={"ID":"88757539-b3d4-4de5-bc96-a4cd13d5a203","Type":"ContainerStarted","Data":"2997316950df81c7ee82c300c51a9427c7afab706fe83ef39229a1aa132e0e4d"} Nov 25 16:13:06 crc kubenswrapper[4743]: I1125 16:13:06.827199 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" event={"ID":"5a86bde8-f04d-4bfa-842f-6c960d7232fb","Type":"ContainerStarted","Data":"43c9222432737e0be9d5d827b59e90a0fdba0ac4a23037c87e7bfc09a199e32d"} Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.830091 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" podUID="88757539-b3d4-4de5-bc96-a4cd13d5a203" Nov 25 16:13:06 crc kubenswrapper[4743]: E1125 16:13:06.837673 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" podUID="5a86bde8-f04d-4bfa-842f-6c960d7232fb" Nov 25 16:13:07 crc kubenswrapper[4743]: I1125 16:13:07.731898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:07 crc kubenswrapper[4743]: I1125 16:13:07.737576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94418bc2-d439-451f-91c2-c457a200825e-cert\") pod \"openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8\" (UID: \"94418bc2-d439-451f-91c2-c457a200825e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.841883 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" podUID="66ead12f-65d1-4438-80b0-1a747105d7fc" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.842507 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" podUID="7be9f6fc-3582-4e14-a452-daa24035d10e" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.842775 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" podUID="9caca3f1-e43f-47ab-aa8a-1248a30cfda4" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.842968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" podUID="5a86bde8-f04d-4bfa-842f-6c960d7232fb" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.843080 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" podUID="d4e33a37-ac1e-408c-b0d3-a1352daa67af" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.843126 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" podUID="05a605e3-814f-45a4-8461-47cbb3330652" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.843172 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" podUID="88757539-b3d4-4de5-bc96-a4cd13d5a203" Nov 25 16:13:07 crc kubenswrapper[4743]: E1125 16:13:07.843222 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" podUID="e2417720-74c0-4232-9f99-cdc10e485c91" Nov 25 16:13:07 crc kubenswrapper[4743]: I1125 16:13:07.856701 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:08 crc kubenswrapper[4743]: I1125 16:13:08.142893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:08 crc kubenswrapper[4743]: I1125 16:13:08.146114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/20c829b2-be6f-4f96-85c1-21279d871c99-webhook-certs\") pod \"openstack-operator-controller-manager-746c9d5b4f-z2hm7\" (UID: \"20c829b2-be6f-4f96-85c1-21279d871c99\") " pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:08 crc kubenswrapper[4743]: I1125 16:13:08.402347 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:13 crc kubenswrapper[4743]: I1125 16:13:13.886638 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" event={"ID":"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf","Type":"ContainerStarted","Data":"6c8204c89c4e49fac73b70a4949e7550bfe3bf4fcac7bf629ba8da648cfe99c8"} Nov 25 16:13:13 crc kubenswrapper[4743]: I1125 16:13:13.890388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" event={"ID":"9b77ce44-3830-488e-ac40-97af4d969f6e","Type":"ContainerStarted","Data":"d729e0579dbbebcc5d73fe97ac85b2a2dfaf0c09af1a18e4e1c98b98a3bd7f5a"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.070029 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8"] Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.178099 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7"] Nov 25 16:13:14 crc kubenswrapper[4743]: E1125 16:13:14.350348 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6zqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-zdjj7_openstack-operators(d0cd465a-f903-48ef-aca1-839a390d3f12): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:14 crc kubenswrapper[4743]: E1125 16:13:14.350424 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jzcqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-7d695c9b56-fwlrd_openstack-operators(6a470e3c-9cac-463b-a253-308f3c386725): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 16:13:14 crc kubenswrapper[4743]: E1125 16:13:14.351766 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" podUID="6a470e3c-9cac-463b-a253-308f3c386725" Nov 25 16:13:14 crc kubenswrapper[4743]: E1125 16:13:14.351827 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" podUID="d0cd465a-f903-48ef-aca1-839a390d3f12" Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.925738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" 
event={"ID":"288e97c2-c236-4177-9a52-bcf1c6c69faa","Type":"ContainerStarted","Data":"4f73b1472b79bd9095b039c029195b1a42ac190add9d10ce3a463179cea8c5de"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.929741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" event={"ID":"0729dc1e-3e2c-410e-892d-ef4773882665","Type":"ContainerStarted","Data":"16584ebd3a7bd9a1e2a0cb8780c44b9988e283297f7643f021156e19e9e1c095"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.942995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" event={"ID":"38527a3c-d051-4354-a8ca-0692153762f1","Type":"ContainerStarted","Data":"ee9bd2b7b44524e41ed299ab3cdae5ed16638960b865da89293235cdad922372"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.973274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" event={"ID":"d0cd465a-f903-48ef-aca1-839a390d3f12","Type":"ContainerStarted","Data":"a0eaf62ae112178c049a36aeef847afc75f927cbbfe996f50686a21c0e2c69dd"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.974121 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:14 crc kubenswrapper[4743]: E1125 16:13:14.975010 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" podUID="d0cd465a-f903-48ef-aca1-839a390d3f12" Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.977490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" event={"ID":"bc109d32-7111-40b6-aff6-7596c933114f","Type":"ContainerStarted","Data":"4749661b450fc30e17883c01824e3bc69f5ab495f0c5ee69322ffe3720e1b714"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.989247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" event={"ID":"29690625-5e1d-417a-b0e5-9d74645b31f7","Type":"ContainerStarted","Data":"9ee2dee66d59f475d12c8fe1366408ccd57e692f5c343ba66c6fce62e2c7c807"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.991706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" event={"ID":"9f422105-6959-44e5-93e2-901fd9b84dfc","Type":"ContainerStarted","Data":"b20bbfdc9e35884df67aa97d69e617fdf123601633cd7076b287277e38003e2c"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.994261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" event={"ID":"8d418847-cf8f-4977-bc14-3d4b64591e68","Type":"ContainerStarted","Data":"07f55603ae4bd686e37f0636474465cdc9092ec1238220fc00c1671a85ec3c0b"} Nov 25 16:13:14 crc kubenswrapper[4743]: I1125 16:13:14.996717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" event={"ID":"aebddcf8-77ce-4317-94c3-f29b45f93686","Type":"ContainerStarted","Data":"5f8f8e76df4a7f82052180b3f5253f41d9107696dac1c8802b2b05e1a162a2d6"} Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.016966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" event={"ID":"44e2f27d-a5d4-48cf-90f5-2f5598a2295a","Type":"ContainerStarted","Data":"096cfd6aaefcf5df4b511545e95fca283ff050d2612499915c5b2da9a366ef0d"} Nov 25 16:13:15 crc 
kubenswrapper[4743]: I1125 16:13:15.018417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" event={"ID":"94418bc2-d439-451f-91c2-c457a200825e","Type":"ContainerStarted","Data":"a4a912e94acd32ffa3bf78a0ebde573323639cf35f4d92d123004b934916ce79"} Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.020431 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" event={"ID":"20c829b2-be6f-4f96-85c1-21279d871c99","Type":"ContainerStarted","Data":"a9aab6a415bc0030cd12e2723d0425e8783135c9068a82bc5bdf73e59676b046"} Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.020458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" event={"ID":"20c829b2-be6f-4f96-85c1-21279d871c99","Type":"ContainerStarted","Data":"1fe99ee1ad849b4b70b1974cf4e64ae47bf54596f6f4b82d6306cf4b23c98f1d"} Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.027284 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.055630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" event={"ID":"6a470e3c-9cac-463b-a253-308f3c386725","Type":"ContainerStarted","Data":"79394b7cbdc32205e90b91d5f0f9dcbbe7349ae36dcf892dbcda80cf9b9c1dbe"} Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.056198 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:15 crc kubenswrapper[4743]: E1125 16:13:15.060848 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" podUID="6a470e3c-9cac-463b-a253-308f3c386725" Nov 25 16:13:15 crc kubenswrapper[4743]: I1125 16:13:15.089156 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" podStartSLOduration=11.089134924 podStartE2EDuration="11.089134924s" podCreationTimestamp="2025-11-25 16:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:13:15.066522866 +0000 UTC m=+874.188362415" watchObservedRunningTime="2025-11-25 16:13:15.089134924 +0000 UTC m=+874.210974473" Nov 25 16:13:16 crc kubenswrapper[4743]: E1125 16:13:16.062586 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" podUID="d0cd465a-f903-48ef-aca1-839a390d3f12" Nov 25 16:13:16 crc kubenswrapper[4743]: E1125 16:13:16.065130 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" podUID="6a470e3c-9cac-463b-a253-308f3c386725" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.417687 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"] Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.421116 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.447756 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"] Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.513319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh5f7\" (UniqueName: \"kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.513451 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.513490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.615026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh5f7\" (UniqueName: \"kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.615153 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.615181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.615766 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.615806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.635442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh5f7\" (UniqueName: \"kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7\") pod \"redhat-marketplace-9hc45\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") " pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:18 crc kubenswrapper[4743]: I1125 16:13:18.764528 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hc45" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.089483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" event={"ID":"0729dc1e-3e2c-410e-892d-ef4773882665","Type":"ContainerStarted","Data":"f080bdb595a354545693a528ac634084c1c632539af850bd10250bc7fe24cb34"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.089850 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.091029 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" event={"ID":"94418bc2-d439-451f-91c2-c457a200825e","Type":"ContainerStarted","Data":"d7732e977b9de704c014505f4749d8d6fcfb5bf8cc8ec6ce0a6fd425a3117545"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.091054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" event={"ID":"94418bc2-d439-451f-91c2-c457a200825e","Type":"ContainerStarted","Data":"a4260df6be755560f14c1298ad0928c129f2fe1515968ccb7b186cb8fff4d89d"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.091173 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.092078 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.092693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" 
event={"ID":"44e2f27d-a5d4-48cf-90f5-2f5598a2295a","Type":"ContainerStarted","Data":"22edddc555a8d714bc6e4558cb2b80f76efaa46ba3109bbf3f2ca5be795b2818"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.092885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.095015 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.095490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" event={"ID":"38527a3c-d051-4354-a8ca-0692153762f1","Type":"ContainerStarted","Data":"b885bc14016b1d9ba79d84b9910a4236d2d8b535c621b9a154f26986de8fa699"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.095686 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.097012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" event={"ID":"9f422105-6959-44e5-93e2-901fd9b84dfc","Type":"ContainerStarted","Data":"e486e3fd25860078972f9b9a8e16462e3517008e265bda6c42c2e32f79d60105"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.097257 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.097301 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.098732 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" event={"ID":"8d418847-cf8f-4977-bc14-3d4b64591e68","Type":"ContainerStarted","Data":"07b76ccb4676b97300ee560e35a89e2907737797fe1706a4a72b262fab70823f"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.098919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.101186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" event={"ID":"288e97c2-c236-4177-9a52-bcf1c6c69faa","Type":"ContainerStarted","Data":"b1208abd3592d8c821f4d515f57a30b751c0b8e3b725cbab50f90a834b6c3bae"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.101387 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.102077 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.102186 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.104376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" event={"ID":"9b77ce44-3830-488e-ac40-97af4d969f6e","Type":"ContainerStarted","Data":"b96398397bc6666d156dce52c7a615488840ce5065c10f78dcc39e90e4d0fb90"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.104603 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.106085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" event={"ID":"bc109d32-7111-40b6-aff6-7596c933114f","Type":"ContainerStarted","Data":"bb0962a037a086c214a718430048174f2ac38c0bf7a54edf2f35c57b367889ef"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.106410 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.106459 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.106516 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.107786 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.108511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" event={"ID":"29690625-5e1d-417a-b0e5-9d74645b31f7","Type":"ContainerStarted","Data":"ae848c1274d59d7f17e8efd3c31f8c83f0077d6e600e6653a57c1cced9514a60"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.109196 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.110929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" event={"ID":"aebddcf8-77ce-4317-94c3-f29b45f93686","Type":"ContainerStarted","Data":"61716b3682abf41cfb56cd8e478d14fa91eeb5e9302111d694474bb20a388eb1"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.111172 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.112289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.112924 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.113614 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" event={"ID":"e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf","Type":"ContainerStarted","Data":"1ff6b7c3a69c55a89b2b5756f513c5470aac3166c19ec1ae75bad139b313b3b2"} Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.113761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.115237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.121736 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-wk72z" podStartSLOduration=3.479842028 podStartE2EDuration="16.121722594s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" 
firstStartedPulling="2025-11-25 16:13:05.156434013 +0000 UTC m=+864.278273562" lastFinishedPulling="2025-11-25 16:13:17.798314569 +0000 UTC m=+876.920154128" observedRunningTime="2025-11-25 16:13:19.115427286 +0000 UTC m=+878.237266845" watchObservedRunningTime="2025-11-25 16:13:19.121722594 +0000 UTC m=+878.243562143" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.151439 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" podStartSLOduration=12.535971155 podStartE2EDuration="16.151424944s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:14.12799209 +0000 UTC m=+873.249831639" lastFinishedPulling="2025-11-25 16:13:17.743445859 +0000 UTC m=+876.865285428" observedRunningTime="2025-11-25 16:13:19.147896483 +0000 UTC m=+878.269736042" watchObservedRunningTime="2025-11-25 16:13:19.151424944 +0000 UTC m=+878.273264493" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.184799 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-82tqm" podStartSLOduration=3.411347092 podStartE2EDuration="16.184780759s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:04.985935711 +0000 UTC m=+864.107775260" lastFinishedPulling="2025-11-25 16:13:17.759369378 +0000 UTC m=+876.881208927" observedRunningTime="2025-11-25 16:13:19.183044795 +0000 UTC m=+878.304884354" watchObservedRunningTime="2025-11-25 16:13:19.184780759 +0000 UTC m=+878.306620308" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.204285 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"] Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.246955 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-f7g4h" podStartSLOduration=3.635029011 podStartE2EDuration="16.246932386s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.177203505 +0000 UTC m=+864.299043054" lastFinishedPulling="2025-11-25 16:13:17.78910688 +0000 UTC m=+876.910946429" observedRunningTime="2025-11-25 16:13:19.209921607 +0000 UTC m=+878.331761166" watchObservedRunningTime="2025-11-25 16:13:19.246932386 +0000 UTC m=+878.368771935" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.269525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-zq7mp" podStartSLOduration=3.6300006529999997 podStartE2EDuration="16.269509184s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.169824233 +0000 UTC m=+864.291663782" lastFinishedPulling="2025-11-25 16:13:17.809332744 +0000 UTC m=+876.931172313" observedRunningTime="2025-11-25 16:13:19.24512967 +0000 UTC m=+878.366969219" watchObservedRunningTime="2025-11-25 16:13:19.269509184 +0000 UTC m=+878.391348733" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.269966 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-djxpp" podStartSLOduration=4.07225052 podStartE2EDuration="16.269962178s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.600628742 +0000 UTC m=+864.722468291" lastFinishedPulling="2025-11-25 16:13:17.79834037 +0000 UTC m=+876.920179949" observedRunningTime="2025-11-25 16:13:19.268067129 +0000 UTC m=+878.389906698" watchObservedRunningTime="2025-11-25 16:13:19.269962178 +0000 UTC m=+878.391801727" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.296706 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-68b95954c9-q8p2b" podStartSLOduration=3.527102899 podStartE2EDuration="16.296691875s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.026576055 +0000 UTC m=+864.148415604" lastFinishedPulling="2025-11-25 16:13:17.796165021 +0000 UTC m=+876.918004580" observedRunningTime="2025-11-25 16:13:19.295802008 +0000 UTC m=+878.417641577" watchObservedRunningTime="2025-11-25 16:13:19.296691875 +0000 UTC m=+878.418531424" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.326849 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-j9zq7" podStartSLOduration=4.098237515 podStartE2EDuration="16.32682775s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.585321091 +0000 UTC m=+864.707160661" lastFinishedPulling="2025-11-25 16:13:17.813911337 +0000 UTC m=+876.935750896" observedRunningTime="2025-11-25 16:13:19.313890464 +0000 UTC m=+878.435730033" watchObservedRunningTime="2025-11-25 16:13:19.32682775 +0000 UTC m=+878.448667299" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.343994 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-mwm9p" podStartSLOduration=4.126554242 podStartE2EDuration="16.343976707s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.580965886 +0000 UTC m=+864.702805435" lastFinishedPulling="2025-11-25 16:13:17.798388311 +0000 UTC m=+876.920227900" observedRunningTime="2025-11-25 16:13:19.340585991 +0000 UTC m=+878.462447071" watchObservedRunningTime="2025-11-25 16:13:19.343976707 +0000 UTC m=+878.465816256" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.358950 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-bvwmw" podStartSLOduration=3.712421966 podStartE2EDuration="16.358931576s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.182701827 +0000 UTC m=+864.304541376" lastFinishedPulling="2025-11-25 16:13:17.829211427 +0000 UTC m=+876.951050986" observedRunningTime="2025-11-25 16:13:19.355652523 +0000 UTC m=+878.477492102" watchObservedRunningTime="2025-11-25 16:13:19.358931576 +0000 UTC m=+878.480771125" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.387247 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-z5rhv" podStartSLOduration=3.768147812 podStartE2EDuration="16.387225582s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.165396634 +0000 UTC m=+864.287236183" lastFinishedPulling="2025-11-25 16:13:17.784474414 +0000 UTC m=+876.906313953" observedRunningTime="2025-11-25 16:13:19.383624449 +0000 UTC m=+878.505464018" watchObservedRunningTime="2025-11-25 16:13:19.387225582 +0000 UTC m=+878.509065131" Nov 25 16:13:19 crc kubenswrapper[4743]: I1125 16:13:19.406418 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-774b86978c-prmnw" podStartSLOduration=3.7960233949999997 podStartE2EDuration="16.406396643s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.178575377 +0000 UTC m=+864.300414926" lastFinishedPulling="2025-11-25 16:13:17.788948615 +0000 UTC m=+876.910788174" observedRunningTime="2025-11-25 16:13:19.404200504 +0000 UTC m=+878.526040073" watchObservedRunningTime="2025-11-25 16:13:19.406396643 +0000 UTC m=+878.528236192" Nov 25 16:13:20 crc kubenswrapper[4743]: I1125 16:13:20.077801 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:13:20 crc kubenswrapper[4743]: I1125 16:13:20.077852 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:13:20 crc kubenswrapper[4743]: I1125 16:13:20.121733 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e373339-e739-4116-a575-239b6976769f" containerID="9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659" exitCode=0 Nov 25 16:13:20 crc kubenswrapper[4743]: I1125 16:13:20.122267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerDied","Data":"9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659"} Nov 25 16:13:20 crc kubenswrapper[4743]: I1125 16:13:20.122295 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerStarted","Data":"38de740ac7fa83576f7a055e6ee95ae37b9b6e2c504205c34b699320a5ec6bab"} Nov 25 16:13:21 crc kubenswrapper[4743]: I1125 16:13:21.134859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" event={"ID":"7be9f6fc-3582-4e14-a452-daa24035d10e","Type":"ContainerStarted","Data":"cc10053abc9046d17aa4224cc1fabf8c70651c4e251d74ba0e3d115bc6ff8e5e"} Nov 25 16:13:21 crc kubenswrapper[4743]: I1125 16:13:21.135249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" event={"ID":"7be9f6fc-3582-4e14-a452-daa24035d10e","Type":"ContainerStarted","Data":"ab9257fccfd389403f8f0db5f80ef8253fda2f1eadbe883d5226b5d841a2977d"} Nov 25 16:13:21 crc kubenswrapper[4743]: I1125 16:13:21.137567 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:21 crc kubenswrapper[4743]: I1125 16:13:21.142425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerStarted","Data":"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"} Nov 25 16:13:21 crc kubenswrapper[4743]: I1125 16:13:21.153564 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" podStartSLOduration=3.383123459 podStartE2EDuration="18.153547895s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.613551746 +0000 UTC m=+864.735391295" lastFinishedPulling="2025-11-25 16:13:20.383976182 +0000 UTC m=+879.505815731" observedRunningTime="2025-11-25 16:13:21.151858032 +0000 UTC m=+880.273697591" watchObservedRunningTime="2025-11-25 16:13:21.153547895 +0000 UTC m=+880.275387434" Nov 25 16:13:22 crc kubenswrapper[4743]: I1125 16:13:22.150408 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e373339-e739-4116-a575-239b6976769f" containerID="7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a" exitCode=0 Nov 25 16:13:22 crc kubenswrapper[4743]: I1125 16:13:22.150491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" 
event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerDied","Data":"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"} Nov 25 16:13:23 crc kubenswrapper[4743]: I1125 16:13:23.159740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" event={"ID":"88757539-b3d4-4de5-bc96-a4cd13d5a203","Type":"ContainerStarted","Data":"02d2cf92b0be2732791690bcdf29bfe7e1cb77580c810befeabe0ae5f329af89"} Nov 25 16:13:23 crc kubenswrapper[4743]: I1125 16:13:23.844990 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" Nov 25 16:13:24 crc kubenswrapper[4743]: I1125 16:13:24.005166 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" Nov 25 16:13:27 crc kubenswrapper[4743]: I1125 16:13:27.864069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8" Nov 25 16:13:28 crc kubenswrapper[4743]: I1125 16:13:28.410805 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-746c9d5b4f-z2hm7" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.075107 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k8t66"] Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.077129 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.085368 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8t66"] Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.169550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.169651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7k9z\" (UniqueName: \"kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.169700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.270971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7k9z\" (UniqueName: \"kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.271041 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.271085 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.271554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.271584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.294524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7k9z\" (UniqueName: \"kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z\") pod \"certified-operators-k8t66\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") " pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:29 crc kubenswrapper[4743]: I1125 16:13:29.393505 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8t66" Nov 25 16:13:30 crc kubenswrapper[4743]: I1125 16:13:30.408093 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8t66"] Nov 25 16:13:30 crc kubenswrapper[4743]: W1125 16:13:30.418249 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57a9957_1565_46cf_a737_2c6dd09e74f5.slice/crio-a0dd89617247339adee8e63d7b45554e1cb858c77511468229e2827d91401ab0 WatchSource:0}: Error finding container a0dd89617247339adee8e63d7b45554e1cb858c77511468229e2827d91401ab0: Status 404 returned error can't find the container with id a0dd89617247339adee8e63d7b45554e1cb858c77511468229e2827d91401ab0 Nov 25 16:13:31 crc kubenswrapper[4743]: I1125 16:13:31.214263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerStarted","Data":"a0dd89617247339adee8e63d7b45554e1cb858c77511468229e2827d91401ab0"} Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.669747 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"] Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.672749 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.677375 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"] Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.723693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7bt\" (UniqueName: \"kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.723950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.724186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.825246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7bt\" (UniqueName: \"kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.825297 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.825375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.825913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.826413 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.849736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7bt\" (UniqueName: \"kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt\") pod \"redhat-operators-sfw7g\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") " pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:32 crc kubenswrapper[4743]: I1125 16:13:32.996137 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:33 crc kubenswrapper[4743]: I1125 16:13:33.404269 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"] Nov 25 16:13:33 crc kubenswrapper[4743]: W1125 16:13:33.406207 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode88d6fa2_edfb_40d6_89e7_4667d4a47ec9.slice/crio-c2b08e64626ef790beb1ac7412ce6f9d0ac89ffc23bea028b84cbfd652ef61d7 WatchSource:0}: Error finding container c2b08e64626ef790beb1ac7412ce6f9d0ac89ffc23bea028b84cbfd652ef61d7: Status 404 returned error can't find the container with id c2b08e64626ef790beb1ac7412ce6f9d0ac89ffc23bea028b84cbfd652ef61d7 Nov 25 16:13:34 crc kubenswrapper[4743]: I1125 16:13:34.236912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerStarted","Data":"c2b08e64626ef790beb1ac7412ce6f9d0ac89ffc23bea028b84cbfd652ef61d7"} Nov 25 16:13:34 crc kubenswrapper[4743]: I1125 16:13:34.341059 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-g5bp8" Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.253359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" event={"ID":"88757539-b3d4-4de5-bc96-a4cd13d5a203","Type":"ContainerStarted","Data":"7a2ef9c4fcb182408fd729a9be9acbb3a6d5082d8e2cd99c2ab717d9fe39b018"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.254764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerStarted","Data":"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"} Nov 
25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.256070 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" event={"ID":"5a86bde8-f04d-4bfa-842f-6c960d7232fb","Type":"ContainerStarted","Data":"b0952346fabd1657e7b69f336b76951740d45a71fcc0911f3cbe3e680ded0706"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.257528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" event={"ID":"d0cd465a-f903-48ef-aca1-839a390d3f12","Type":"ContainerStarted","Data":"e67164eaa1f8fba8d197fc653be0db0e3948e183c6a6fe2455266d8282d44466"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.258749 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" event={"ID":"6a470e3c-9cac-463b-a253-308f3c386725","Type":"ContainerStarted","Data":"54c72b36b2f6a15c56883aebd0d7f45d4a13c8f10acf79ece91d1c21125c2f8b"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.260608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerStarted","Data":"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.261703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" event={"ID":"66ead12f-65d1-4438-80b0-1a747105d7fc","Type":"ContainerStarted","Data":"89eed30d87ab6d4f3806f5b9f9ab0274385dc20cb2d06d5ed7aaaf66daa012fb"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.262772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" 
event={"ID":"e2417720-74c0-4232-9f99-cdc10e485c91","Type":"ContainerStarted","Data":"62ca8612b8910b3df3bb15cd8caa82fbf4f308c3f8574676c378f2c1f93bdf11"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.263729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" event={"ID":"05a605e3-814f-45a4-8461-47cbb3330652","Type":"ContainerStarted","Data":"420ebf57b0f33c38ebff97cd3b185196a512ed041ec0d4ce0435d3a8a8d87703"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.264835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" event={"ID":"d4e33a37-ac1e-408c-b0d3-a1352daa67af","Type":"ContainerStarted","Data":"1e81653fd0bcbb884e82398953e745dd28e48dcb8cfb27a05a96e9f3bc3970cd"} Nov 25 16:13:36 crc kubenswrapper[4743]: I1125 16:13:36.265930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" event={"ID":"9caca3f1-e43f-47ab-aa8a-1248a30cfda4","Type":"ContainerStarted","Data":"db484db7601de2f2668cf113fd863ea90c8943c1abc835637f19065e6c6899f1"} Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.272345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" event={"ID":"05a605e3-814f-45a4-8461-47cbb3330652","Type":"ContainerStarted","Data":"443a2b361d44e193b0d35d9bd32135af622409c0dffa281c852b0f6bad45b782"} Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.273883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" event={"ID":"9caca3f1-e43f-47ab-aa8a-1248a30cfda4","Type":"ContainerStarted","Data":"784750a1b18f4a77b4998c037cebb55d50073db1fca68b8a7940b707a5359b2b"} Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.275510 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerID="dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44" exitCode=0 Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.275544 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerDied","Data":"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"} Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.299680 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-fwlrd" podStartSLOduration=25.690958788 podStartE2EDuration="34.299662285s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.028519596 +0000 UTC m=+864.150359145" lastFinishedPulling="2025-11-25 16:13:13.637223093 +0000 UTC m=+872.759062642" observedRunningTime="2025-11-25 16:13:37.293495121 +0000 UTC m=+896.415334690" watchObservedRunningTime="2025-11-25 16:13:37.299662285 +0000 UTC m=+896.421501834" Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.351463 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg" podStartSLOduration=18.407885204 podStartE2EDuration="34.351415626s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.625053077 +0000 UTC m=+864.746892616" lastFinishedPulling="2025-11-25 16:13:21.568583489 +0000 UTC m=+880.690423038" observedRunningTime="2025-11-25 16:13:37.347464693 +0000 UTC m=+896.469304262" watchObservedRunningTime="2025-11-25 16:13:37.351415626 +0000 UTC m=+896.473255185" Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.367982 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-zdjj7" 
podStartSLOduration=26.325770427 podStartE2EDuration="34.367962044s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.597410641 +0000 UTC m=+864.719250210" lastFinishedPulling="2025-11-25 16:13:13.639602278 +0000 UTC m=+872.761441827" observedRunningTime="2025-11-25 16:13:37.361212233 +0000 UTC m=+896.483051802" watchObservedRunningTime="2025-11-25 16:13:37.367962044 +0000 UTC m=+896.489801603"
Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.384635 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g4sq7" podStartSLOduration=9.713386859 podStartE2EDuration="33.384614966s" podCreationTimestamp="2025-11-25 16:13:04 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.633177771 +0000 UTC m=+864.755017320" lastFinishedPulling="2025-11-25 16:13:29.304405878 +0000 UTC m=+888.426245427" observedRunningTime="2025-11-25 16:13:37.378951508 +0000 UTC m=+896.500791057" watchObservedRunningTime="2025-11-25 16:13:37.384614966 +0000 UTC m=+896.506454515"
Nov 25 16:13:37 crc kubenswrapper[4743]: I1125 16:13:37.405798 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9hc45" podStartSLOduration=9.742201231 podStartE2EDuration="19.40577981s" podCreationTimestamp="2025-11-25 16:13:18 +0000 UTC" firstStartedPulling="2025-11-25 16:13:20.344688992 +0000 UTC m=+879.466528541" lastFinishedPulling="2025-11-25 16:13:30.008267571 +0000 UTC m=+889.130107120" observedRunningTime="2025-11-25 16:13:37.404146538 +0000 UTC m=+896.525986087" watchObservedRunningTime="2025-11-25 16:13:37.40577981 +0000 UTC m=+896.527619349"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.282413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerStarted","Data":"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd"}
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.284225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" event={"ID":"5a86bde8-f04d-4bfa-842f-6c960d7232fb","Type":"ContainerStarted","Data":"82df777d1cf837a3530845b4f59f95ade85db7a3e7812271840bfc03d32d355f"}
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.284336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.286376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" event={"ID":"e2417720-74c0-4232-9f99-cdc10e485c91","Type":"ContainerStarted","Data":"ab7fdafd9b099f381a5b3790ded231363306a45d7ce4c7c64b1b13b12d64df67"}
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.286445 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.288558 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" event={"ID":"d4e33a37-ac1e-408c-b0d3-a1352daa67af","Type":"ContainerStarted","Data":"82a9aed9c4677eea4f1a216716711b610b43107a98496796bb245de8ad445529"}
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.288766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.288806 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.288826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.300993 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx" podStartSLOduration=9.942277752 podStartE2EDuration="34.300974008s" podCreationTimestamp="2025-11-25 16:13:04 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.632381306 +0000 UTC m=+864.754220855" lastFinishedPulling="2025-11-25 16:13:29.991077542 +0000 UTC m=+889.112917111" observedRunningTime="2025-11-25 16:13:38.297624552 +0000 UTC m=+897.419464111" watchObservedRunningTime="2025-11-25 16:13:38.300974008 +0000 UTC m=+897.422813557"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.315204 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8" podStartSLOduration=10.927480148 podStartE2EDuration="35.315187193s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.620861815 +0000 UTC m=+864.742701364" lastFinishedPulling="2025-11-25 16:13:30.00856885 +0000 UTC m=+889.130408409" observedRunningTime="2025-11-25 16:13:38.313290794 +0000 UTC m=+897.435130353" watchObservedRunningTime="2025-11-25 16:13:38.315187193 +0000 UTC m=+897.437026742"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.333077 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh" podStartSLOduration=10.99715531 podStartE2EDuration="35.333061823s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.632160289 +0000 UTC m=+864.753999838" lastFinishedPulling="2025-11-25 16:13:29.968066802 +0000 UTC m=+889.089906351" observedRunningTime="2025-11-25 16:13:38.327503229 +0000 UTC m=+897.449342778" watchObservedRunningTime="2025-11-25 16:13:38.333061823 +0000 UTC m=+897.454901372"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.343495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr" podStartSLOduration=10.979522538 podStartE2EDuration="35.3434789s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.604190442 +0000 UTC m=+864.726029991" lastFinishedPulling="2025-11-25 16:13:29.968146804 +0000 UTC m=+889.089986353" observedRunningTime="2025-11-25 16:13:38.339733212 +0000 UTC m=+897.461572771" watchObservedRunningTime="2025-11-25 16:13:38.3434789 +0000 UTC m=+897.465318449"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.363664 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h" podStartSLOduration=10.96391741 podStartE2EDuration="35.363649662s" podCreationTimestamp="2025-11-25 16:13:03 +0000 UTC" firstStartedPulling="2025-11-25 16:13:05.612249655 +0000 UTC m=+864.734089204" lastFinishedPulling="2025-11-25 16:13:30.011981887 +0000 UTC m=+889.133821456" observedRunningTime="2025-11-25 16:13:38.35341012 +0000 UTC m=+897.475249679" watchObservedRunningTime="2025-11-25 16:13:38.363649662 +0000 UTC m=+897.485489211"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.765200 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.765256 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:38 crc kubenswrapper[4743]: I1125 16:13:38.809439 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:39 crc kubenswrapper[4743]: I1125 16:13:39.295669 4743 generic.go:334] "Generic (PLEG): container finished" podID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerID="c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd" exitCode=0
Nov 25 16:13:39 crc kubenswrapper[4743]: I1125 16:13:39.295715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerDied","Data":"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd"}
Nov 25 16:13:39 crc kubenswrapper[4743]: I1125 16:13:39.297585 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerStarted","Data":"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"}
Nov 25 16:13:40 crc kubenswrapper[4743]: I1125 16:13:40.305424 4743 generic.go:334] "Generic (PLEG): container finished" podID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerID="49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225" exitCode=0
Nov 25 16:13:40 crc kubenswrapper[4743]: I1125 16:13:40.305503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerDied","Data":"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"}
Nov 25 16:13:40 crc kubenswrapper[4743]: I1125 16:13:40.308196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerStarted","Data":"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3"}
Nov 25 16:13:41 crc kubenswrapper[4743]: I1125 16:13:41.317853 4743 generic.go:334] "Generic (PLEG): container finished" podID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerID="8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3" exitCode=0
Nov 25 16:13:41 crc kubenswrapper[4743]: I1125 16:13:41.317934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerDied","Data":"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3"}
Nov 25 16:13:41 crc kubenswrapper[4743]: I1125 16:13:41.320226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerStarted","Data":"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"}
Nov 25 16:13:41 crc kubenswrapper[4743]: I1125 16:13:41.379163 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k8t66" podStartSLOduration=8.973728615 podStartE2EDuration="12.379137013s" podCreationTimestamp="2025-11-25 16:13:29 +0000 UTC" firstStartedPulling="2025-11-25 16:13:37.276928642 +0000 UTC m=+896.398768191" lastFinishedPulling="2025-11-25 16:13:40.68233704 +0000 UTC m=+899.804176589" observedRunningTime="2025-11-25 16:13:41.370354328 +0000 UTC m=+900.492193887" watchObservedRunningTime="2025-11-25 16:13:41.379137013 +0000 UTC m=+900.500976562"
Nov 25 16:13:42 crc kubenswrapper[4743]: I1125 16:13:42.328812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerStarted","Data":"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5"}
Nov 25 16:13:42 crc kubenswrapper[4743]: I1125 16:13:42.348123 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sfw7g" podStartSLOduration=7.844197519 podStartE2EDuration="10.348100062s" podCreationTimestamp="2025-11-25 16:13:32 +0000 UTC" firstStartedPulling="2025-11-25 16:13:39.297028266 +0000 UTC m=+898.418867815" lastFinishedPulling="2025-11-25 16:13:41.800930809 +0000 UTC m=+900.922770358" observedRunningTime="2025-11-25 16:13:42.343511448 +0000 UTC m=+901.465351007" watchObservedRunningTime="2025-11-25 16:13:42.348100062 +0000 UTC m=+901.469939611"
Nov 25 16:13:42 crc kubenswrapper[4743]: I1125 16:13:42.998172 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sfw7g"
Nov 25 16:13:42 crc kubenswrapper[4743]: I1125 16:13:42.998455 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sfw7g"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.032929 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sfw7g" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="registry-server" probeResult="failure" output=<
Nov 25 16:13:44 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Nov 25 16:13:44 crc kubenswrapper[4743]: >
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.043718 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-8s47h"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.226896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.229502 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-bxdjg"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.336357 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-smsbr"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.384278 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-8lsj8"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.412986 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-xbxjh"
Nov 25 16:13:44 crc kubenswrapper[4743]: I1125 16:13:44.760537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-864885998-pjmtx"
Nov 25 16:13:48 crc kubenswrapper[4743]: I1125 16:13:48.812679 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:48 crc kubenswrapper[4743]: I1125 16:13:48.858443 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"]
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.373388 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9hc45" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="registry-server" containerID="cri-o://fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1" gracePeriod=2
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.394526 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.394572 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.440550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.761877 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.871432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content\") pod \"1e373339-e739-4116-a575-239b6976769f\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") "
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.871537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh5f7\" (UniqueName: \"kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7\") pod \"1e373339-e739-4116-a575-239b6976769f\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") "
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.871582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities\") pod \"1e373339-e739-4116-a575-239b6976769f\" (UID: \"1e373339-e739-4116-a575-239b6976769f\") "
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.872450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities" (OuterVolumeSpecName: "utilities") pod "1e373339-e739-4116-a575-239b6976769f" (UID: "1e373339-e739-4116-a575-239b6976769f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.877749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7" (OuterVolumeSpecName: "kube-api-access-qh5f7") pod "1e373339-e739-4116-a575-239b6976769f" (UID: "1e373339-e739-4116-a575-239b6976769f"). InnerVolumeSpecName "kube-api-access-qh5f7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.887012 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e373339-e739-4116-a575-239b6976769f" (UID: "1e373339-e739-4116-a575-239b6976769f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.973314 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.973362 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e373339-e739-4116-a575-239b6976769f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:49 crc kubenswrapper[4743]: I1125 16:13:49.973373 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh5f7\" (UniqueName: \"kubernetes.io/projected/1e373339-e739-4116-a575-239b6976769f-kube-api-access-qh5f7\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.077076 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.077137 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.384619 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e373339-e739-4116-a575-239b6976769f" containerID="fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1" exitCode=0
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.384705 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hc45"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.384759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerDied","Data":"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"}
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.384798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hc45" event={"ID":"1e373339-e739-4116-a575-239b6976769f","Type":"ContainerDied","Data":"38de740ac7fa83576f7a055e6ee95ae37b9b6e2c504205c34b699320a5ec6bab"}
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.384814 4743 scope.go:117] "RemoveContainer" containerID="fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.410802 4743 scope.go:117] "RemoveContainer" containerID="7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.416031 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"]
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.422362 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hc45"]
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.432386 4743 scope.go:117] "RemoveContainer" containerID="9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.443761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.457990 4743 scope.go:117] "RemoveContainer" containerID="fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"
Nov 25 16:13:50 crc kubenswrapper[4743]: E1125 16:13:50.458561 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1\": container with ID starting with fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1 not found: ID does not exist" containerID="fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.458628 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1"} err="failed to get container status \"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1\": rpc error: code = NotFound desc = could not find container \"fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1\": container with ID starting with fa6dd7c35335aa5b760a675139ded637962e8ad967acc70786ac98ec2a31e5d1 not found: ID does not exist"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.458655 4743 scope.go:117] "RemoveContainer" containerID="7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"
Nov 25 16:13:50 crc kubenswrapper[4743]: E1125 16:13:50.458968 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a\": container with ID starting with 7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a not found: ID does not exist" containerID="7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.459000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a"} err="failed to get container status \"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a\": rpc error: code = NotFound desc = could not find container \"7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a\": container with ID starting with 7cc2619cdcdafaf2bff9dd45db9367ba0af640f1a0bdc4c8ecfd18507f524d4a not found: ID does not exist"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.459014 4743 scope.go:117] "RemoveContainer" containerID="9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659"
Nov 25 16:13:50 crc kubenswrapper[4743]: E1125 16:13:50.459228 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659\": container with ID starting with 9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659 not found: ID does not exist" containerID="9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659"
Nov 25 16:13:50 crc kubenswrapper[4743]: I1125 16:13:50.459293 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659"} err="failed to get container status \"9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659\": rpc error: code = NotFound desc = could not find container \"9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659\": container with ID starting with 9c82ea444a59353f4bc68f6358265c7dba51eb8aec6b682811a03e02eb8dd659 not found: ID does not exist"
Nov 25 16:13:51 crc kubenswrapper[4743]: I1125 16:13:51.783781 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e373339-e739-4116-a575-239b6976769f" path="/var/lib/kubelet/pods/1e373339-e739-4116-a575-239b6976769f/volumes"
Nov 25 16:13:51 crc kubenswrapper[4743]: I1125 16:13:51.814272 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8t66"]
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.398081 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k8t66" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="registry-server" containerID="cri-o://c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1" gracePeriod=2
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.784442 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.911476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7k9z\" (UniqueName: \"kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z\") pod \"f57a9957-1565-46cf-a737-2c6dd09e74f5\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") "
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.911858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content\") pod \"f57a9957-1565-46cf-a737-2c6dd09e74f5\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") "
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.911996 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities\") pod \"f57a9957-1565-46cf-a737-2c6dd09e74f5\" (UID: \"f57a9957-1565-46cf-a737-2c6dd09e74f5\") "
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.912969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities" (OuterVolumeSpecName: "utilities") pod "f57a9957-1565-46cf-a737-2c6dd09e74f5" (UID: "f57a9957-1565-46cf-a737-2c6dd09e74f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.919138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z" (OuterVolumeSpecName: "kube-api-access-g7k9z") pod "f57a9957-1565-46cf-a737-2c6dd09e74f5" (UID: "f57a9957-1565-46cf-a737-2c6dd09e74f5"). InnerVolumeSpecName "kube-api-access-g7k9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:13:52 crc kubenswrapper[4743]: I1125 16:13:52.960790 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f57a9957-1565-46cf-a737-2c6dd09e74f5" (UID: "f57a9957-1565-46cf-a737-2c6dd09e74f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.014056 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7k9z\" (UniqueName: \"kubernetes.io/projected/f57a9957-1565-46cf-a737-2c6dd09e74f5-kube-api-access-g7k9z\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.014089 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.014098 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f57a9957-1565-46cf-a737-2c6dd09e74f5-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.061289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sfw7g"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.105248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sfw7g"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.410748 4743 generic.go:334] "Generic (PLEG): container finished" podID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerID="c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1" exitCode=0
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.410840 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8t66"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.410842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerDied","Data":"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"}
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.411293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8t66" event={"ID":"f57a9957-1565-46cf-a737-2c6dd09e74f5","Type":"ContainerDied","Data":"a0dd89617247339adee8e63d7b45554e1cb858c77511468229e2827d91401ab0"}
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.411316 4743 scope.go:117] "RemoveContainer" containerID="c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.437896 4743 scope.go:117] "RemoveContainer" containerID="49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.459577 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8t66"]
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.469328 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k8t66"]
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.482986 4743 scope.go:117] "RemoveContainer" containerID="dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.498881 4743 scope.go:117] "RemoveContainer" containerID="c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"
Nov 25 16:13:53 crc kubenswrapper[4743]: E1125 16:13:53.499220 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1\": container with ID starting with c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1 not found: ID does not exist" containerID="c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.499253 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1"} err="failed to get container status \"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1\": rpc error: code = NotFound desc = could not find container \"c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1\": container with ID starting with c9a6b88de37afa4093ce341138e0d905f41b403ba9eb42050ca6c851a96201d1 not found: ID does not exist"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.499278 4743 scope.go:117] "RemoveContainer" containerID="49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"
Nov 25 16:13:53 crc kubenswrapper[4743]: E1125 16:13:53.499534 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225\": container with ID starting with 49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225 not found: ID does not exist" containerID="49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.499561 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225"} err="failed to get container status \"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225\": rpc error: code = NotFound desc = could not find container \"49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225\": container with ID starting with 49fb83b84d17571735f99939a1b92387f01128dec8f314cb8be1ec5225a1b225 not found: ID does not exist"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.499582 4743 scope.go:117] "RemoveContainer" containerID="dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"
Nov 25 16:13:53 crc kubenswrapper[4743]: E1125 16:13:53.499928 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44\": container with ID starting with dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44 not found: ID does not exist" containerID="dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.499955 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44"} err="failed to get container status \"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44\": rpc error: code = NotFound desc = could not find container \"dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44\": container with ID starting with dc78b27505ff085acde4a8ccaf94b7cb6bb7d355302bafa6baf79ff790a65f44 not found: ID does not exist"
Nov 25 16:13:53 crc kubenswrapper[4743]: I1125 16:13:53.794824 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" path="/var/lib/kubelet/pods/f57a9957-1565-46cf-a737-2c6dd09e74f5/volumes"
Nov 25 16:13:56 crc kubenswrapper[4743]: I1125 16:13:56.217866 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"]
Nov 25 16:13:56 crc kubenswrapper[4743]: I1125 16:13:56.219209 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sfw7g" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="registry-server" containerID="cri-o://52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5" gracePeriod=2
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.160637 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sfw7g"
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.277650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7bt\" (UniqueName: \"kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt\") pod \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") "
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.277693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content\") pod \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") "
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.277739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities\") pod \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\" (UID: \"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9\") "
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.278796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities" (OuterVolumeSpecName: "utilities") pod "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" (UID: "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.286425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt" (OuterVolumeSpecName: "kube-api-access-gd7bt") pod "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" (UID: "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9"). InnerVolumeSpecName "kube-api-access-gd7bt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.379681 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7bt\" (UniqueName: \"kubernetes.io/projected/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-kube-api-access-gd7bt\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.379724 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.387770 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" (UID: "e88d6fa2-edfb-40d6-89e7-4667d4a47ec9"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.441884 4743 generic.go:334] "Generic (PLEG): container finished" podID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerID="52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5" exitCode=0 Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.441928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerDied","Data":"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5"} Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.441955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sfw7g" event={"ID":"e88d6fa2-edfb-40d6-89e7-4667d4a47ec9","Type":"ContainerDied","Data":"c2b08e64626ef790beb1ac7412ce6f9d0ac89ffc23bea028b84cbfd652ef61d7"} Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.441976 4743 scope.go:117] "RemoveContainer" containerID="52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.442003 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sfw7g" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.461695 4743 scope.go:117] "RemoveContainer" containerID="8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.472600 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"] Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.478861 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sfw7g"] Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.480860 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.495448 4743 scope.go:117] "RemoveContainer" containerID="c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.512478 4743 scope.go:117] "RemoveContainer" containerID="52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5" Nov 25 16:13:57 crc kubenswrapper[4743]: E1125 16:13:57.512953 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5\": container with ID starting with 52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5 not found: ID does not exist" containerID="52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.513007 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5"} err="failed to get container status 
\"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5\": rpc error: code = NotFound desc = could not find container \"52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5\": container with ID starting with 52ffdfabe73a63ee887f2d2d2a9285ca7b9a8ab46aebb8235955a6ca403d44f5 not found: ID does not exist" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.513042 4743 scope.go:117] "RemoveContainer" containerID="8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3" Nov 25 16:13:57 crc kubenswrapper[4743]: E1125 16:13:57.513643 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3\": container with ID starting with 8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3 not found: ID does not exist" containerID="8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.513678 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3"} err="failed to get container status \"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3\": rpc error: code = NotFound desc = could not find container \"8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3\": container with ID starting with 8f9fce249854dd5fe6d747c7354250506cb3f70744a43644223b28f59d203ff3 not found: ID does not exist" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.513727 4743 scope.go:117] "RemoveContainer" containerID="c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd" Nov 25 16:13:57 crc kubenswrapper[4743]: E1125 16:13:57.514059 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd\": container with ID starting with c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd not found: ID does not exist" containerID="c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.514088 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd"} err="failed to get container status \"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd\": rpc error: code = NotFound desc = could not find container \"c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd\": container with ID starting with c76185a6a31cc3336467f7fab58e6074ee99aa17aef666ab6e2bbd194f2112fd not found: ID does not exist" Nov 25 16:13:57 crc kubenswrapper[4743]: I1125 16:13:57.788211 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" path="/var/lib/kubelet/pods/e88d6fa2-edfb-40d6-89e7-4667d4a47ec9/volumes" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.869948 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870520 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870535 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870580 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870610 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870623 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870631 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870652 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870658 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870669 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870675 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870695 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870701 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870721 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870733 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870740 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="extract-utilities" Nov 25 16:13:59 crc kubenswrapper[4743]: E1125 16:13:59.870753 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870760 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="extract-content" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870910 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e88d6fa2-edfb-40d6-89e7-4667d4a47ec9" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870925 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e373339-e739-4116-a575-239b6976769f" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.870935 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57a9957-1565-46cf-a737-2c6dd09e74f5" containerName="registry-server" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.871662 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.879337 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.879475 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.879532 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.884805 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.886106 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hgbhj" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.926065 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.927209 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.930227 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 16:13:59 crc kubenswrapper[4743]: I1125 16:13:59.946490 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.010468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkst8\" (UniqueName: \"kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.010514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.010715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.010798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 
crc kubenswrapper[4743]: I1125 16:14:00.010878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fk6h\" (UniqueName: \"kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.112201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkst8\" (UniqueName: \"kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.112455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.112580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.112707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.112802 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fk6h\" (UniqueName: \"kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.113446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.113507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.113560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.133718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fk6h\" (UniqueName: \"kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h\") pod \"dnsmasq-dns-78dd6ddcc-sgdfb\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.133785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkst8\" (UniqueName: 
\"kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8\") pod \"dnsmasq-dns-675f4bcbfc-c552s\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.188991 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.243558 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.634729 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:14:00 crc kubenswrapper[4743]: W1125 16:14:00.637845 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f91b710_af26_4e54_a350_1ff3aaec211b.slice/crio-9b8bfbef9e04b1dfc06e079268cd6c8ce5fd634cb37db0116e66d434aa1e7355 WatchSource:0}: Error finding container 9b8bfbef9e04b1dfc06e079268cd6c8ce5fd634cb37db0116e66d434aa1e7355: Status 404 returned error can't find the container with id 9b8bfbef9e04b1dfc06e079268cd6c8ce5fd634cb37db0116e66d434aa1e7355 Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.639695 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:14:00 crc kubenswrapper[4743]: W1125 16:14:00.702233 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d5517ba_a17f_482e_bea0_533e24a259a9.slice/crio-20cc117df6af3fa8d96f1645072cf7b5578f843fd412b69df76cd3435f451e7b WatchSource:0}: Error finding container 20cc117df6af3fa8d96f1645072cf7b5578f843fd412b69df76cd3435f451e7b: Status 404 returned error can't find the container with id 20cc117df6af3fa8d96f1645072cf7b5578f843fd412b69df76cd3435f451e7b 
Nov 25 16:14:00 crc kubenswrapper[4743]: I1125 16:14:00.704309 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:14:01 crc kubenswrapper[4743]: I1125 16:14:01.470463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" event={"ID":"0f91b710-af26-4e54-a350-1ff3aaec211b","Type":"ContainerStarted","Data":"9b8bfbef9e04b1dfc06e079268cd6c8ce5fd634cb37db0116e66d434aa1e7355"} Nov 25 16:14:01 crc kubenswrapper[4743]: I1125 16:14:01.472135 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" event={"ID":"1d5517ba-a17f-482e-bea0-533e24a259a9","Type":"ContainerStarted","Data":"20cc117df6af3fa8d96f1645072cf7b5578f843fd412b69df76cd3435f451e7b"} Nov 25 16:14:02 crc kubenswrapper[4743]: I1125 16:14:02.899368 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:14:02 crc kubenswrapper[4743]: I1125 16:14:02.935462 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:02 crc kubenswrapper[4743]: I1125 16:14:02.936850 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:02 crc kubenswrapper[4743]: I1125 16:14:02.941105 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.055767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhp6p\" (UniqueName: \"kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.055865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.055920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.156933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.157010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhp6p\" (UniqueName: 
\"kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.157080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.158115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.158776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.193134 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhp6p\" (UniqueName: \"kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p\") pod \"dnsmasq-dns-666b6646f7-s8tgf\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.263895 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.309360 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.366312 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"] Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.367654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.407656 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"] Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.464779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.464892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4lmb\" (UniqueName: \"kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.464916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 
16:14:03.571293 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.571394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4lmb\" (UniqueName: \"kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.571417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.572233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.572439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.594645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4lmb\" 
(UniqueName: \"kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb\") pod \"dnsmasq-dns-57d769cc4f-8jpbt\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.700271 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:03 crc kubenswrapper[4743]: I1125 16:14:03.867188 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:03 crc kubenswrapper[4743]: W1125 16:14:03.871056 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1d629a2_dc68_4402_b4b9_7d9da6214e50.slice/crio-d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead WatchSource:0}: Error finding container d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead: Status 404 returned error can't find the container with id d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.185095 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"] Nov 25 16:14:04 crc kubenswrapper[4743]: W1125 16:14:04.199376 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb79aaa9_4b33_4db5_991f_9a4b3aee84ae.slice/crio-74d074321693d925aa187b166738ec4c824e7d4f8312e4b19c8139eebdbb0483 WatchSource:0}: Error finding container 74d074321693d925aa187b166738ec4c824e7d4f8312e4b19c8139eebdbb0483: Status 404 returned error can't find the container with id 74d074321693d925aa187b166738ec4c824e7d4f8312e4b19c8139eebdbb0483 Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.298520 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:14:04 crc 
kubenswrapper[4743]: I1125 16:14:04.299873 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.301804 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.301910 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.304585 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.304676 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.313715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2dmzd" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.315203 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.320917 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.336128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384274 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpvg\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384444 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.384552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.489310 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.489908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " 
pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpvg\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490980 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.491074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.490638 4743 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.491261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.491337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.492706 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.495222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.496401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.499317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.501867 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.502242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.502948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.512298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpvg\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.512809 4743 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.514525 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.518090 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.518332 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.518486 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.519624 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.519787 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zzg9l" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.519958 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.520088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.534500 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.549557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc 
kubenswrapper[4743]: I1125 16:14:04.551299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" event={"ID":"d1d629a2-dc68-4402-b4b9-7d9da6214e50","Type":"ContainerStarted","Data":"d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead"} Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.558924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" event={"ID":"db79aaa9-4b33-4db5-991f-9a4b3aee84ae","Type":"ContainerStarted","Data":"74d074321693d925aa187b166738ec4c824e7d4f8312e4b19c8139eebdbb0483"} Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6rdt\" (UniqueName: 
\"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594267 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.594443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.644064 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698270 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6rdt\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.698402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.699242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.699480 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.699526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.699673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.700392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.700419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.703835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.704457 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.706688 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.707544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc 
kubenswrapper[4743]: I1125 16:14:04.720005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6rdt\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.723638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:04 crc kubenswrapper[4743]: I1125 16:14:04.903228 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.640416 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.642187 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.644983 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.645167 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.650804 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.651047 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-48lpf" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.656460 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.657907 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.710913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.710968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhtb\" (UniqueName: \"kubernetes.io/projected/e54e0104-81dc-49fc-9233-135bf00032be-kube-api-access-tkhtb\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.710999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.711098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-kolla-config\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.711177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.711243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-config-data-default\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.711270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54e0104-81dc-49fc-9233-135bf00032be-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.711360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.812791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.812872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhtb\" (UniqueName: \"kubernetes.io/projected/e54e0104-81dc-49fc-9233-135bf00032be-kube-api-access-tkhtb\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.812965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.813010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-kolla-config\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.813047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " 
pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.813081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-config-data-default\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.813103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54e0104-81dc-49fc-9233-135bf00032be-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.813189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.815233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.815370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-kolla-config\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.815453 4743 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.817490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e54e0104-81dc-49fc-9233-135bf00032be-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.818391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e54e0104-81dc-49fc-9233-135bf00032be-config-data-default\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.819727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.820023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e54e0104-81dc-49fc-9233-135bf00032be-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.838335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhtb\" (UniqueName: 
\"kubernetes.io/projected/e54e0104-81dc-49fc-9233-135bf00032be-kube-api-access-tkhtb\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.842205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e54e0104-81dc-49fc-9233-135bf00032be\") " pod="openstack/openstack-galera-0" Nov 25 16:14:05 crc kubenswrapper[4743]: I1125 16:14:05.968811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.096144 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.098045 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.099810 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.100877 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.103703 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2x7h9" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.104275 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.140520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145081 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145303 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145371 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145481 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.145540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcskc\" (UniqueName: \"kubernetes.io/projected/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kube-api-access-bcskc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.246895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.246968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.246993 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247136 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bcskc\" (UniqueName: \"kubernetes.io/projected/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kube-api-access-bcskc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.247735 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.248390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.248433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.248880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.249118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.253712 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.265813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.267552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.268197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcskc\" (UniqueName: \"kubernetes.io/projected/e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d-kube-api-access-bcskc\") pod \"openstack-cell1-galera-0\" (UID: \"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d\") " pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.356812 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.357866 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.359950 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.360519 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.361243 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6fxm7" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.371416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.444027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.449193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-combined-ca-bundle\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.449240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-kolla-config\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.449298 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-config-data\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " 
pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.449328 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6txr\" (UniqueName: \"kubernetes.io/projected/30545138-1305-45e8-9225-386065312213-kube-api-access-h6txr\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.449364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-memcached-tls-certs\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.550292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-config-data\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.550349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6txr\" (UniqueName: \"kubernetes.io/projected/30545138-1305-45e8-9225-386065312213-kube-api-access-h6txr\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.550390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-memcached-tls-certs\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.550424 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-combined-ca-bundle\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.550439 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-kolla-config\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.551510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-kolla-config\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.552040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30545138-1305-45e8-9225-386065312213-config-data\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.559075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-memcached-tls-certs\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.559222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30545138-1305-45e8-9225-386065312213-combined-ca-bundle\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " 
pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.570919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6txr\" (UniqueName: \"kubernetes.io/projected/30545138-1305-45e8-9225-386065312213-kube-api-access-h6txr\") pod \"memcached-0\" (UID: \"30545138-1305-45e8-9225-386065312213\") " pod="openstack/memcached-0" Nov 25 16:14:07 crc kubenswrapper[4743]: I1125 16:14:07.674100 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.495239 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.496468 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.499787 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-s2ms5" Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.509562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.606533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mzq\" (UniqueName: \"kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq\") pod \"kube-state-metrics-0\" (UID: \"8337526c-dedb-4e1a-b73e-a9c37c6e6927\") " pod="openstack/kube-state-metrics-0" Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.707903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mzq\" (UniqueName: \"kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq\") pod \"kube-state-metrics-0\" (UID: \"8337526c-dedb-4e1a-b73e-a9c37c6e6927\") " 
pod="openstack/kube-state-metrics-0"
Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.747041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mzq\" (UniqueName: \"kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq\") pod \"kube-state-metrics-0\" (UID: \"8337526c-dedb-4e1a-b73e-a9c37c6e6927\") " pod="openstack/kube-state-metrics-0"
Nov 25 16:14:10 crc kubenswrapper[4743]: I1125 16:14:10.812167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.942427 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8dtsl"]
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.944266 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.946569 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.946782 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.947029 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6hsvt"
Nov 25 16:14:13 crc kubenswrapper[4743]: I1125 16:14:13.962934 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8dtsl"]
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.021852 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lmnwx"]
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.031150 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.041272 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lmnwx"]
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.056783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-log-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.056846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2r8\" (UniqueName: \"kubernetes.io/projected/7750901a-7566-4d94-8cb5-5aff66e22116-kube-api-access-qx2r8\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.056869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-scripts\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.056899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-ovn-controller-tls-certs\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.056958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-etc-ovs\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058126 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-combined-ca-bundle\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-log\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7750901a-7566-4d94-8cb5-5aff66e22116-scripts\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058333 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-run\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-lib\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.058397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnnn\" (UniqueName: \"kubernetes.io/projected/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-kube-api-access-smnnn\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-log-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2r8\" (UniqueName: \"kubernetes.io/projected/7750901a-7566-4d94-8cb5-5aff66e22116-kube-api-access-qx2r8\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-scripts\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-ovn-controller-tls-certs\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-etc-ovs\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-combined-ca-bundle\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-log\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7750901a-7566-4d94-8cb5-5aff66e22116-scripts\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.159997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.160028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-run\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.160062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-lib\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.160080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnnn\" (UniqueName: \"kubernetes.io/projected/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-kube-api-access-smnnn\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.160280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-log-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.160457 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-etc-ovs\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.161225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run-ovn\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.161322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7750901a-7566-4d94-8cb5-5aff66e22116-var-run\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.161376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-run\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.161525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-lib\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.162010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-var-log\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.162130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7750901a-7566-4d94-8cb5-5aff66e22116-scripts\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.163365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-scripts\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.167468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-combined-ca-bundle\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.169832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7750901a-7566-4d94-8cb5-5aff66e22116-ovn-controller-tls-certs\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.177784 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2r8\" (UniqueName: \"kubernetes.io/projected/7750901a-7566-4d94-8cb5-5aff66e22116-kube-api-access-qx2r8\") pod \"ovn-controller-8dtsl\" (UID: \"7750901a-7566-4d94-8cb5-5aff66e22116\") " pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.178092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnnn\" (UniqueName: \"kubernetes.io/projected/ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc-kube-api-access-smnnn\") pod \"ovn-controller-ovs-lmnwx\" (UID: \"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc\") " pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.269679 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8dtsl"
Nov 25 16:14:14 crc kubenswrapper[4743]: I1125 16:14:14.351321 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lmnwx"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.405409 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.405889 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkst8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-c552s_openstack(0f91b710-af26-4e54-a350-1ff3aaec211b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.407059 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" podUID="0f91b710-af26-4e54-a350-1ff3aaec211b"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.561573 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.561763 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fk6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sgdfb_openstack(1d5517ba-a17f-482e-bea0-533e24a259a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 16:14:16 crc kubenswrapper[4743]: E1125 16:14:16.563039 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" podUID="1d5517ba-a17f-482e-bea0-533e24a259a9"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.119158 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.211852 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc\") pod \"1d5517ba-a17f-482e-bea0-533e24a259a9\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") "
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.211972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config\") pod \"1d5517ba-a17f-482e-bea0-533e24a259a9\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") "
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.212001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fk6h\" (UniqueName: \"kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h\") pod \"1d5517ba-a17f-482e-bea0-533e24a259a9\" (UID: \"1d5517ba-a17f-482e-bea0-533e24a259a9\") "
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.213189 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d5517ba-a17f-482e-bea0-533e24a259a9" (UID: "1d5517ba-a17f-482e-bea0-533e24a259a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.213533 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config" (OuterVolumeSpecName: "config") pod "1d5517ba-a17f-482e-bea0-533e24a259a9" (UID: "1d5517ba-a17f-482e-bea0-533e24a259a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.217454 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.220020 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.220249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h" (OuterVolumeSpecName: "kube-api-access-8fk6h") pod "1d5517ba-a17f-482e-bea0-533e24a259a9" (UID: "1d5517ba-a17f-482e-bea0-533e24a259a9"). InnerVolumeSpecName "kube-api-access-8fk6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.225004 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.225201 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.225530 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.225654 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.225758 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pb86g"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.232512 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.240050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8dtsl"]
Nov 25 16:14:17 crc kubenswrapper[4743]: W1125 16:14:17.241835 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7750901a_7566_4d94_8cb5_5aff66e22116.slice/crio-17fa844d38056fe27289ca1ea29c658e850071b5656b1945b3f6aea61dd9003a WatchSource:0}: Error finding container 17fa844d38056fe27289ca1ea29c658e850071b5656b1945b3f6aea61dd9003a: Status 404 returned error can't find the container with id 17fa844d38056fe27289ca1ea29c658e850071b5656b1945b3f6aea61dd9003a
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.263007 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.277332 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.287283 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313440 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config\") pod \"0f91b710-af26-4e54-a350-1ff3aaec211b\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") "
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkst8\" (UniqueName: \"kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8\") pod \"0f91b710-af26-4e54-a350-1ff3aaec211b\" (UID: \"0f91b710-af26-4e54-a350-1ff3aaec211b\") "
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313689 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-config\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffkm\" (UniqueName: \"kubernetes.io/projected/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-kube-api-access-qffkm\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.313982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config" (OuterVolumeSpecName: "config") pod "0f91b710-af26-4e54-a350-1ff3aaec211b" (UID: "0f91b710-af26-4e54-a350-1ff3aaec211b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314005 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314178 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f91b710-af26-4e54-a350-1ff3aaec211b-config\") on node \"crc\" DevicePath \"\""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314189 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314200 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5517ba-a17f-482e-bea0-533e24a259a9-config\") on node \"crc\" DevicePath \"\""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.314209 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fk6h\" (UniqueName: \"kubernetes.io/projected/1d5517ba-a17f-482e-bea0-533e24a259a9-kube-api-access-8fk6h\") on node \"crc\" DevicePath \"\""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.316797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8" (OuterVolumeSpecName: "kube-api-access-wkst8") pod "0f91b710-af26-4e54-a350-1ff3aaec211b" (UID: "0f91b710-af26-4e54-a350-1ff3aaec211b"). InnerVolumeSpecName "kube-api-access-wkst8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffkm\" (UniqueName: \"kubernetes.io/projected/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-kube-api-access-qffkm\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415295 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-config\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.415394 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkst8\" (UniqueName: \"kubernetes.io/projected/0f91b710-af26-4e54-a350-1ff3aaec211b-kube-api-access-wkst8\") on node \"crc\" DevicePath \"\""
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.416128 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.416939 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.417226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.417543 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-config\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.420578 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.422057 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.422862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.449725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffkm\" (UniqueName: \"kubernetes.io/projected/3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1-kube-api-access-qffkm\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.460668 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.463299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.464566 4743 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.467746 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.467860 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-xf48l" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.469115 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.472355 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.480159 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.515967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.516174 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4tt6\" (UniqueName: \"kubernetes.io/projected/1c6f500e-afe2-4505-8a75-d68f109b80dc-kube-api-access-z4tt6\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 
16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.545484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4tt6\" (UniqueName: \"kubernetes.io/projected/1c6f500e-afe2-4505-8a75-d68f109b80dc-kube-api-access-z4tt6\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " 
pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.617971 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.618326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.619202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-config\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.619785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c6f500e-afe2-4505-8a75-d68f109b80dc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.622254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.636667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.639327 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c6f500e-afe2-4505-8a75-d68f109b80dc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.648927 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.658338 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.658417 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4tt6\" (UniqueName: \"kubernetes.io/projected/1c6f500e-afe2-4505-8a75-d68f109b80dc-kube-api-access-z4tt6\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.663961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:14:17 crc kubenswrapper[4743]: W1125 16:14:17.669856 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30545138_1305_45e8_9225_386065312213.slice/crio-3608bae3698188d90510f57a1935ce3b2777dcfd1c5f3c6ac34a66d00d0af3b8 WatchSource:0}: Error finding container 3608bae3698188d90510f57a1935ce3b2777dcfd1c5f3c6ac34a66d00d0af3b8: Status 404 returned error can't find the container with id 3608bae3698188d90510f57a1935ce3b2777dcfd1c5f3c6ac34a66d00d0af3b8 Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.675771 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.678818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1c6f500e-afe2-4505-8a75-d68f109b80dc\") " pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: W1125 16:14:17.684958 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32600c5f_46d2_441f_bda1_2ca9e0c35f35.slice/crio-b2c07c46b4e775ca1fd0ee1f9e2d73a6f089dede606b1ddfd7232bdfe47a0125 WatchSource:0}: Error finding container b2c07c46b4e775ca1fd0ee1f9e2d73a6f089dede606b1ddfd7232bdfe47a0125: Status 404 returned error can't find the container with id 
b2c07c46b4e775ca1fd0ee1f9e2d73a6f089dede606b1ddfd7232bdfe47a0125 Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.700627 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" event={"ID":"1d5517ba-a17f-482e-bea0-533e24a259a9","Type":"ContainerDied","Data":"20cc117df6af3fa8d96f1645072cf7b5578f843fd412b69df76cd3435f451e7b"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.700730 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sgdfb" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.706316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"30545138-1305-45e8-9225-386065312213","Type":"ContainerStarted","Data":"3608bae3698188d90510f57a1935ce3b2777dcfd1c5f3c6ac34a66d00d0af3b8"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.711908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d","Type":"ContainerStarted","Data":"82a61d1b557084be627e1e2b63a9d8f5635087efe896903749d1f4916be7f9a0"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.715514 4743 generic.go:334] "Generic (PLEG): container finished" podID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerID="d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e" exitCode=0 Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.715629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" event={"ID":"db79aaa9-4b33-4db5-991f-9a4b3aee84ae","Type":"ContainerDied","Data":"d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.718758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8dtsl" 
event={"ID":"7750901a-7566-4d94-8cb5-5aff66e22116","Type":"ContainerStarted","Data":"17fa844d38056fe27289ca1ea29c658e850071b5656b1945b3f6aea61dd9003a"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.721728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e54e0104-81dc-49fc-9233-135bf00032be","Type":"ContainerStarted","Data":"16f0be286876ffe45de302e6a77e58e5587a2614c4da94119293712ae25fd013"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.723898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerStarted","Data":"59a57cdd3a977ad2ed968a380c17af996a8977a8a484ea7ed1f32142b594ffc3"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.725734 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerID="901d3be8795e04661205fb1aa2266da8a1e9e4439bac76741fbace47da5d1025" exitCode=0 Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.725788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" event={"ID":"d1d629a2-dc68-4402-b4b9-7d9da6214e50","Type":"ContainerDied","Data":"901d3be8795e04661205fb1aa2266da8a1e9e4439bac76741fbace47da5d1025"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.741280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8337526c-dedb-4e1a-b73e-a9c37c6e6927","Type":"ContainerStarted","Data":"0249f6ac0c8a1da11804294780798c2fe79821f4a39d8b7198793565363dba36"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.743096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" event={"ID":"0f91b710-af26-4e54-a350-1ff3aaec211b","Type":"ContainerDied","Data":"9b8bfbef9e04b1dfc06e079268cd6c8ce5fd634cb37db0116e66d434aa1e7355"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 
16:14:17.743137 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c552s" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.744464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerStarted","Data":"b2c07c46b4e775ca1fd0ee1f9e2d73a6f089dede606b1ddfd7232bdfe47a0125"} Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.814198 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.817093 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.817126 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sgdfb"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.852980 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.863856 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c552s"] Nov 25 16:14:17 crc kubenswrapper[4743]: I1125 16:14:17.878961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lmnwx"] Nov 25 16:14:17 crc kubenswrapper[4743]: E1125 16:14:17.949543 4743 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Nov 25 16:14:17 crc kubenswrapper[4743]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d1d629a2-dc68-4402-b4b9-7d9da6214e50/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 25 16:14:17 crc kubenswrapper[4743]: > podSandboxID="d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead" Nov 25 16:14:17 crc 
kubenswrapper[4743]: E1125 16:14:17.950052 4743 kuberuntime_manager.go:1274] "Unhandled Error" err=< Nov 25 16:14:17 crc kubenswrapper[4743]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fhp6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-s8tgf_openstack(d1d629a2-dc68-4402-b4b9-7d9da6214e50): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d1d629a2-dc68-4402-b4b9-7d9da6214e50/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 25 16:14:17 crc kubenswrapper[4743]: > logger="UnhandledError" Nov 25 16:14:17 crc kubenswrapper[4743]: E1125 16:14:17.951131 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d1d629a2-dc68-4402-b4b9-7d9da6214e50/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.083317 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 16:14:18 crc kubenswrapper[4743]: W1125 16:14:18.098936 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ea5ee8a_d85d_40b3_ad4c_b89f29a7fdf1.slice/crio-cd8087831acfcd7f6ea0fbf6850b24a492a8d743edc41454285630ec7f65f640 WatchSource:0}: Error finding container cd8087831acfcd7f6ea0fbf6850b24a492a8d743edc41454285630ec7f65f640: Status 404 returned error can't find the container with id cd8087831acfcd7f6ea0fbf6850b24a492a8d743edc41454285630ec7f65f640 Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.347054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 16:14:18 crc kubenswrapper[4743]: W1125 16:14:18.349159 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c6f500e_afe2_4505_8a75_d68f109b80dc.slice/crio-06c20e0420dab2b9cd056476f8c7c9ad2fc4f062d7ffbd375102bd2acf4e2ec1 WatchSource:0}: Error finding container 06c20e0420dab2b9cd056476f8c7c9ad2fc4f062d7ffbd375102bd2acf4e2ec1: Status 404 returned error can't find the container with id 06c20e0420dab2b9cd056476f8c7c9ad2fc4f062d7ffbd375102bd2acf4e2ec1 Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.753766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" event={"ID":"db79aaa9-4b33-4db5-991f-9a4b3aee84ae","Type":"ContainerStarted","Data":"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2"} Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.754137 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.758636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1","Type":"ContainerStarted","Data":"cd8087831acfcd7f6ea0fbf6850b24a492a8d743edc41454285630ec7f65f640"} Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.759781 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1c6f500e-afe2-4505-8a75-d68f109b80dc","Type":"ContainerStarted","Data":"06c20e0420dab2b9cd056476f8c7c9ad2fc4f062d7ffbd375102bd2acf4e2ec1"} Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.761780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lmnwx" event={"ID":"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc","Type":"ContainerStarted","Data":"8326f5f8149e6a880f6ab38a34a293eff5a449f0e156dd1bbf36708535765ebb"} Nov 25 16:14:18 crc kubenswrapper[4743]: I1125 16:14:18.775109 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" podStartSLOduration=3.380599917 podStartE2EDuration="15.775084101s" podCreationTimestamp="2025-11-25 16:14:03 +0000 UTC" firstStartedPulling="2025-11-25 16:14:04.202460952 +0000 UTC m=+923.324300511" lastFinishedPulling="2025-11-25 16:14:16.596945136 +0000 UTC m=+935.718784695" observedRunningTime="2025-11-25 16:14:18.774692648 +0000 UTC m=+937.896532197" watchObservedRunningTime="2025-11-25 16:14:18.775084101 +0000 UTC m=+937.896923650" Nov 25 16:14:19 crc kubenswrapper[4743]: I1125 16:14:19.821277 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f91b710-af26-4e54-a350-1ff3aaec211b" path="/var/lib/kubelet/pods/0f91b710-af26-4e54-a350-1ff3aaec211b/volumes" Nov 25 16:14:19 crc kubenswrapper[4743]: I1125 16:14:19.822143 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5517ba-a17f-482e-bea0-533e24a259a9" path="/var/lib/kubelet/pods/1d5517ba-a17f-482e-bea0-533e24a259a9/volumes" Nov 25 16:14:19 crc kubenswrapper[4743]: I1125 16:14:19.822532 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" event={"ID":"d1d629a2-dc68-4402-b4b9-7d9da6214e50","Type":"ContainerStarted","Data":"2a00f98310e27a5c06cb131c0351e11dc4b6e48bc0581770c577a6f99f333999"} Nov 25 16:14:19 crc kubenswrapper[4743]: 
I1125 16:14:19.822762 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:19 crc kubenswrapper[4743]: I1125 16:14:19.849439 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" podStartSLOduration=5.119475168 podStartE2EDuration="17.849419182s" podCreationTimestamp="2025-11-25 16:14:02 +0000 UTC" firstStartedPulling="2025-11-25 16:14:03.877405477 +0000 UTC m=+922.999245026" lastFinishedPulling="2025-11-25 16:14:16.607349491 +0000 UTC m=+935.729189040" observedRunningTime="2025-11-25 16:14:19.847018277 +0000 UTC m=+938.968857836" watchObservedRunningTime="2025-11-25 16:14:19.849419182 +0000 UTC m=+938.971258731" Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.077513 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.077569 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.077641 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.078231 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c"} 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.078280 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c" gracePeriod=600 Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.837195 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c" exitCode=0 Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.837245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c"} Nov 25 16:14:20 crc kubenswrapper[4743]: I1125 16:14:20.837673 4743 scope.go:117] "RemoveContainer" containerID="cb34634469d7134691d6897c55890dcb1975da56bf10ed444c930c92d7b2c025" Nov 25 16:14:23 crc kubenswrapper[4743]: I1125 16:14:23.268155 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:23 crc kubenswrapper[4743]: I1125 16:14:23.701818 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:14:23 crc kubenswrapper[4743]: I1125 16:14:23.807549 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:23 crc kubenswrapper[4743]: I1125 16:14:23.876348 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="dnsmasq-dns" containerID="cri-o://2a00f98310e27a5c06cb131c0351e11dc4b6e48bc0581770c577a6f99f333999" gracePeriod=10 Nov 25 16:14:23 crc kubenswrapper[4743]: I1125 16:14:23.877310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a"} Nov 25 16:14:24 crc kubenswrapper[4743]: I1125 16:14:24.899241 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerID="2a00f98310e27a5c06cb131c0351e11dc4b6e48bc0581770c577a6f99f333999" exitCode=0 Nov 25 16:14:24 crc kubenswrapper[4743]: I1125 16:14:24.899444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" event={"ID":"d1d629a2-dc68-4402-b4b9-7d9da6214e50","Type":"ContainerDied","Data":"2a00f98310e27a5c06cb131c0351e11dc4b6e48bc0581770c577a6f99f333999"} Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.696408 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.800404 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc\") pod \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.800549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config\") pod \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.800624 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhp6p\" (UniqueName: \"kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p\") pod \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\" (UID: \"d1d629a2-dc68-4402-b4b9-7d9da6214e50\") " Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.804979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p" (OuterVolumeSpecName: "kube-api-access-fhp6p") pod "d1d629a2-dc68-4402-b4b9-7d9da6214e50" (UID: "d1d629a2-dc68-4402-b4b9-7d9da6214e50"). InnerVolumeSpecName "kube-api-access-fhp6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.839753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config" (OuterVolumeSpecName: "config") pod "d1d629a2-dc68-4402-b4b9-7d9da6214e50" (UID: "d1d629a2-dc68-4402-b4b9-7d9da6214e50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.845708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1d629a2-dc68-4402-b4b9-7d9da6214e50" (UID: "d1d629a2-dc68-4402-b4b9-7d9da6214e50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.902017 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhp6p\" (UniqueName: \"kubernetes.io/projected/d1d629a2-dc68-4402-b4b9-7d9da6214e50-kube-api-access-fhp6p\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.902051 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.902060 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d629a2-dc68-4402-b4b9-7d9da6214e50-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.923222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" event={"ID":"d1d629a2-dc68-4402-b4b9-7d9da6214e50","Type":"ContainerDied","Data":"d227781455495179ac10f8d5526753a8902b41cf8766b6d366845c892e7e2ead"} Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.923294 4743 scope.go:117] "RemoveContainer" containerID="2a00f98310e27a5c06cb131c0351e11dc4b6e48bc0581770c577a6f99f333999" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.923320 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-s8tgf" Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.961674 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:27 crc kubenswrapper[4743]: I1125 16:14:27.969050 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-s8tgf"] Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.181194 4743 scope.go:117] "RemoveContainer" containerID="901d3be8795e04661205fb1aa2266da8a1e9e4439bac76741fbace47da5d1025" Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.943046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"30545138-1305-45e8-9225-386065312213","Type":"ContainerStarted","Data":"9c681750f50c8f1c99346114be715d37cd2c9fbe8212c66fdd40c9261033c9dc"} Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.943777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.946538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8337526c-dedb-4e1a-b73e-a9c37c6e6927","Type":"ContainerStarted","Data":"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952"} Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.946698 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.949920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d","Type":"ContainerStarted","Data":"cb7f4a15a88810db05e4e316bf4ac34f0936097baa047f77387f1e4f033615b4"} Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.952332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"1c6f500e-afe2-4505-8a75-d68f109b80dc","Type":"ContainerStarted","Data":"67abf6cb0e1c7d0b18ab8ace2a24bde8ea6deb213d65dcd932e4b6a25b2f6e51"} Nov 25 16:14:28 crc kubenswrapper[4743]: I1125 16:14:28.954293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e54e0104-81dc-49fc-9233-135bf00032be","Type":"ContainerStarted","Data":"c9a7fac6e3b8e28618db28b6989a7f4e69a7770a715a42277b3af0fb31f7ed9a"} Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.003295 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.015191843 podStartE2EDuration="22.00327545s" podCreationTimestamp="2025-11-25 16:14:07 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.679263916 +0000 UTC m=+936.801103455" lastFinishedPulling="2025-11-25 16:14:27.667347503 +0000 UTC m=+946.789187062" observedRunningTime="2025-11-25 16:14:28.964860927 +0000 UTC m=+948.086700506" watchObservedRunningTime="2025-11-25 16:14:29.00327545 +0000 UTC m=+948.125114999" Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.027494 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=7.974821177 podStartE2EDuration="19.027474209s" podCreationTimestamp="2025-11-25 16:14:10 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.674173497 +0000 UTC m=+936.796013046" lastFinishedPulling="2025-11-25 16:14:28.726826529 +0000 UTC m=+947.848666078" observedRunningTime="2025-11-25 16:14:29.014009186 +0000 UTC m=+948.135848735" watchObservedRunningTime="2025-11-25 16:14:29.027474209 +0000 UTC m=+948.149313758" Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.787056 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" path="/var/lib/kubelet/pods/d1d629a2-dc68-4402-b4b9-7d9da6214e50/volumes" Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.969660 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerStarted","Data":"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9"} Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.975133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerStarted","Data":"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6"} Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.978082 4743 generic.go:334] "Generic (PLEG): container finished" podID="ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc" containerID="33b4085c9db3004a0c8ece9335920cb0de3ce8625712c7d50d14f794525cbc45" exitCode=0 Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.978177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lmnwx" event={"ID":"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc","Type":"ContainerDied","Data":"33b4085c9db3004a0c8ece9335920cb0de3ce8625712c7d50d14f794525cbc45"} Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.979786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1","Type":"ContainerStarted","Data":"c6d9f995b419342364c1b1ca08ee1a68ac1fb7ee77b186697c5b57dce9b1bf5d"} Nov 25 16:14:29 crc kubenswrapper[4743]: I1125 16:14:29.982255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8dtsl" event={"ID":"7750901a-7566-4d94-8cb5-5aff66e22116","Type":"ContainerStarted","Data":"0dfaf6a6a8d349f9c68f045cb3171bbf14124cd14d22f67593c23c601d485aca"} Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.025612 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8dtsl" podStartSLOduration=6.656421755 podStartE2EDuration="17.025578151s" podCreationTimestamp="2025-11-25 
16:14:13 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.243303647 +0000 UTC m=+936.365143196" lastFinishedPulling="2025-11-25 16:14:27.612460033 +0000 UTC m=+946.734299592" observedRunningTime="2025-11-25 16:14:30.018099497 +0000 UTC m=+949.139939046" watchObservedRunningTime="2025-11-25 16:14:30.025578151 +0000 UTC m=+949.147417690" Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.993781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lmnwx" event={"ID":"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc","Type":"ContainerStarted","Data":"80a2ad3b85b1e6703e9a970f31dd9f5f9dc9b328c8b3169a8f0a6b4832bb381a"} Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.994312 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8dtsl" Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.994324 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lmnwx" Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.994334 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lmnwx" event={"ID":"ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc","Type":"ContainerStarted","Data":"e235556e7d9cd8498bc3e6531cf7e9ca6e0fbf8e9a0c42d590451672fd6cb8c2"} Nov 25 16:14:30 crc kubenswrapper[4743]: I1125 16:14:30.994345 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lmnwx" Nov 25 16:14:31 crc kubenswrapper[4743]: I1125 16:14:31.012545 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lmnwx" podStartSLOduration=8.218106867 podStartE2EDuration="18.012522585s" podCreationTimestamp="2025-11-25 16:14:13 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.871246632 +0000 UTC m=+936.993086181" lastFinishedPulling="2025-11-25 16:14:27.66566235 +0000 UTC m=+946.787501899" observedRunningTime="2025-11-25 16:14:31.010460569 +0000 UTC 
m=+950.132300148" watchObservedRunningTime="2025-11-25 16:14:31.012522585 +0000 UTC m=+950.134362134" Nov 25 16:14:37 crc kubenswrapper[4743]: I1125 16:14:37.675316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.056881 4743 generic.go:334] "Generic (PLEG): container finished" podID="e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d" containerID="cb7f4a15a88810db05e4e316bf4ac34f0936097baa047f77387f1e4f033615b4" exitCode=0 Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.057475 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d","Type":"ContainerDied","Data":"cb7f4a15a88810db05e4e316bf4ac34f0936097baa047f77387f1e4f033615b4"} Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.841240 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.911945 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:40 crc kubenswrapper[4743]: E1125 16:14:40.912323 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="dnsmasq-dns" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.912345 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="dnsmasq-dns" Nov 25 16:14:40 crc kubenswrapper[4743]: E1125 16:14:40.912371 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="init" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.912382 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="init" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.912609 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d629a2-dc68-4402-b4b9-7d9da6214e50" containerName="dnsmasq-dns" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.913514 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:40 crc kubenswrapper[4743]: I1125 16:14:40.936239 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.033685 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgpkc\" (UniqueName: \"kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.033756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.033853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.068806 4743 generic.go:334] "Generic (PLEG): container finished" podID="e54e0104-81dc-49fc-9233-135bf00032be" containerID="c9a7fac6e3b8e28618db28b6989a7f4e69a7770a715a42277b3af0fb31f7ed9a" exitCode=0 Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 
16:14:41.068860 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e54e0104-81dc-49fc-9233-135bf00032be","Type":"ContainerDied","Data":"c9a7fac6e3b8e28618db28b6989a7f4e69a7770a715a42277b3af0fb31f7ed9a"} Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.135676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.135792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgpkc\" (UniqueName: \"kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.135865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.136759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.136831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.157466 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgpkc\" (UniqueName: \"kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc\") pod \"dnsmasq-dns-7cb5889db5-9xzkx\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:41 crc kubenswrapper[4743]: I1125 16:14:41.246997 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.115457 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.122234 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.124783 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.125096 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.125667 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-w42k6" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.125739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.141990 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.152458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw4rp\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-kube-api-access-qw4rp\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.152532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.152566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-lock\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 
16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.152633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-cache\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.152658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.253815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.253856 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-lock\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.253905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-cache\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.253925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.254042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw4rp\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-kube-api-access-qw4rp\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.254519 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.254542 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.254604 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. No retries permitted until 2025-11-25 16:14:42.75456868 +0000 UTC m=+961.876408229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.254723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-cache\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.254934 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.255219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9ae66928-3c05-4597-98a6-f663e9df7cff-lock\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.305192 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw4rp\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-kube-api-access-qw4rp\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.338862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " 
pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.628131 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m926g"] Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.630150 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.645542 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.645638 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.645947 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.656500 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m926g"] Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbq8s\" (UniqueName: \"kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663842 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663874 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.663988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.764843 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.764897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.764946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.764974 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.764998 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbq8s\" (UniqueName: 
\"kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.765499 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.765528 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: E1125 16:14:42.765601 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. 
No retries permitted until 2025-11-25 16:14:43.76556672 +0000 UTC m=+962.887406269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.765964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.771229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.771409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.773802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.788838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbq8s\" (UniqueName: \"kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s\") pod \"swift-ring-rebalance-m926g\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") " pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:42 crc kubenswrapper[4743]: I1125 16:14:42.956795 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m926g" Nov 25 16:14:43 crc kubenswrapper[4743]: I1125 16:14:43.779174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:43 crc kubenswrapper[4743]: E1125 16:14:43.779348 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:43 crc kubenswrapper[4743]: E1125 16:14:43.779383 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:43 crc kubenswrapper[4743]: E1125 16:14:43.779443 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. No retries permitted until 2025-11-25 16:14:45.779424966 +0000 UTC m=+964.901264515 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:45 crc kubenswrapper[4743]: I1125 16:14:45.811185 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:45 crc kubenswrapper[4743]: E1125 16:14:45.811294 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:45 crc kubenswrapper[4743]: E1125 16:14:45.811573 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:45 crc kubenswrapper[4743]: E1125 16:14:45.811643 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. No retries permitted until 2025-11-25 16:14:49.811623289 +0000 UTC m=+968.933462888 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.122803 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1c6f500e-afe2-4505-8a75-d68f109b80dc","Type":"ContainerStarted","Data":"cf3336b9a05ac87c857e5393f29159c3fe8e2d74d8488d8694cc12bd15efa179"} Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.126143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e54e0104-81dc-49fc-9233-135bf00032be","Type":"ContainerStarted","Data":"1cda1fafa242c0d1b4163df29a143d0aa5c07f950f820beb2d0c1e3b18913e72"} Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.128492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1","Type":"ContainerStarted","Data":"40732ff2b360cd2e87906adf6049618af97f0700b5031263c12822e711654218"} Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.130875 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d","Type":"ContainerStarted","Data":"160bda80f1f50aa7d01259253cefc76aba4adb94fcce80cd2ea1144d618882d5"} Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.218112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:47 crc kubenswrapper[4743]: W1125 16:14:47.226136 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aeef8c3_165f_49a3_a0ac_90a59da51c35.slice/crio-ecbd3961b07e5a3522cb37269c2f7ef5fe71b84ea80ded1c5dfdda48ca002f82 WatchSource:0}: Error finding container 
ecbd3961b07e5a3522cb37269c2f7ef5fe71b84ea80ded1c5dfdda48ca002f82: Status 404 returned error can't find the container with id ecbd3961b07e5a3522cb37269c2f7ef5fe71b84ea80ded1c5dfdda48ca002f82 Nov 25 16:14:47 crc kubenswrapper[4743]: I1125 16:14:47.304117 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m926g"] Nov 25 16:14:47 crc kubenswrapper[4743]: W1125 16:14:47.314240 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5eff179_2afc_4ec2_addc_31c3c36a6fd7.slice/crio-cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5 WatchSource:0}: Error finding container cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5: Status 404 returned error can't find the container with id cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5 Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.139196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m926g" event={"ID":"f5eff179-2afc-4ec2-addc-31c3c36a6fd7","Type":"ContainerStarted","Data":"cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5"} Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.140823 4743 generic.go:334] "Generic (PLEG): container finished" podID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerID="58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa" exitCode=0 Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.140889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" event={"ID":"5aeef8c3-165f-49a3-a0ac-90a59da51c35","Type":"ContainerDied","Data":"58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa"} Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.140920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" 
event={"ID":"5aeef8c3-165f-49a3-a0ac-90a59da51c35","Type":"ContainerStarted","Data":"ecbd3961b07e5a3522cb37269c2f7ef5fe71b84ea80ded1c5dfdda48ca002f82"} Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.193553 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.792011484 podStartE2EDuration="32.19353005s" podCreationTimestamp="2025-11-25 16:14:16 +0000 UTC" firstStartedPulling="2025-11-25 16:14:18.351311743 +0000 UTC m=+937.473151292" lastFinishedPulling="2025-11-25 16:14:46.752830299 +0000 UTC m=+965.874669858" observedRunningTime="2025-11-25 16:14:48.166898396 +0000 UTC m=+967.288737945" watchObservedRunningTime="2025-11-25 16:14:48.19353005 +0000 UTC m=+967.315369619" Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.219117 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.572874717 podStartE2EDuration="32.219099821s" podCreationTimestamp="2025-11-25 16:14:16 +0000 UTC" firstStartedPulling="2025-11-25 16:14:18.101621189 +0000 UTC m=+937.223460738" lastFinishedPulling="2025-11-25 16:14:46.747846283 +0000 UTC m=+965.869685842" observedRunningTime="2025-11-25 16:14:48.215100066 +0000 UTC m=+967.336939615" watchObservedRunningTime="2025-11-25 16:14:48.219099821 +0000 UTC m=+967.340939370" Nov 25 16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.242029 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=31.841584283 podStartE2EDuration="42.242008389s" podCreationTimestamp="2025-11-25 16:14:06 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.265383179 +0000 UTC m=+936.387222728" lastFinishedPulling="2025-11-25 16:14:27.665807275 +0000 UTC m=+946.787646834" observedRunningTime="2025-11-25 16:14:48.239126888 +0000 UTC m=+967.360966447" watchObservedRunningTime="2025-11-25 16:14:48.242008389 +0000 UTC m=+967.363847948" Nov 25 
16:14:48 crc kubenswrapper[4743]: I1125 16:14:48.266631 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=34.199053973 podStartE2EDuration="44.266606689s" podCreationTimestamp="2025-11-25 16:14:04 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.669850322 +0000 UTC m=+936.791689871" lastFinishedPulling="2025-11-25 16:14:27.737403038 +0000 UTC m=+946.859242587" observedRunningTime="2025-11-25 16:14:48.26213567 +0000 UTC m=+967.383975229" watchObservedRunningTime="2025-11-25 16:14:48.266606689 +0000 UTC m=+967.388446238" Nov 25 16:14:49 crc kubenswrapper[4743]: I1125 16:14:49.165678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" event={"ID":"5aeef8c3-165f-49a3-a0ac-90a59da51c35","Type":"ContainerStarted","Data":"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a"} Nov 25 16:14:49 crc kubenswrapper[4743]: I1125 16:14:49.165843 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:49 crc kubenswrapper[4743]: I1125 16:14:49.877130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:49 crc kubenswrapper[4743]: E1125 16:14:49.877418 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:49 crc kubenswrapper[4743]: E1125 16:14:49.877617 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:49 crc kubenswrapper[4743]: E1125 16:14:49.877667 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. No retries permitted until 2025-11-25 16:14:57.877650827 +0000 UTC m=+976.999490376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:50 crc kubenswrapper[4743]: I1125 16:14:50.545707 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 25 16:14:50 crc kubenswrapper[4743]: I1125 16:14:50.587450 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 25 16:14:50 crc kubenswrapper[4743]: I1125 16:14:50.609082 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" podStartSLOduration=10.609063424 podStartE2EDuration="10.609063424s" podCreationTimestamp="2025-11-25 16:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:49.193938114 +0000 UTC m=+968.315777673" watchObservedRunningTime="2025-11-25 16:14:50.609063424 +0000 UTC m=+969.730902973" Nov 25 16:14:50 crc kubenswrapper[4743]: I1125 16:14:50.815061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:50 crc kubenswrapper[4743]: I1125 16:14:50.851520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.183353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m926g" 
event={"ID":"f5eff179-2afc-4ec2-addc-31c3c36a6fd7","Type":"ContainerStarted","Data":"840b5269267debf5ba2619371f678be12d6462f7e5b7102d9c467b13f8a75834"} Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.184236 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.184284 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.211319 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m926g" podStartSLOduration=5.910370298 podStartE2EDuration="9.211293833s" podCreationTimestamp="2025-11-25 16:14:42 +0000 UTC" firstStartedPulling="2025-11-25 16:14:47.318143682 +0000 UTC m=+966.439983231" lastFinishedPulling="2025-11-25 16:14:50.619067217 +0000 UTC m=+969.740906766" observedRunningTime="2025-11-25 16:14:51.201670161 +0000 UTC m=+970.323509740" watchObservedRunningTime="2025-11-25 16:14:51.211293833 +0000 UTC m=+970.333133382" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.223460 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.228163 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.398379 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.398683 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="dnsmasq-dns" containerID="cri-o://e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a" gracePeriod=10 Nov 25 16:14:51 crc 
kubenswrapper[4743]: I1125 16:14:51.432809 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.434274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.436881 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.452671 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.504020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.504155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.504192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.504656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-pkvfd\" (UniqueName: \"kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.606086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkvfd\" (UniqueName: \"kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.606183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.606234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.606256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.607190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.607267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.607523 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.615195 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-znflv"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.616202 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.624035 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.638884 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-znflv"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.657638 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkvfd\" (UniqueName: \"kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd\") pod \"dnsmasq-dns-74f6f696b9-96bvs\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovn-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-combined-ca-bundle\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jmx\" (UniqueName: \"kubernetes.io/projected/5835f976-c6b4-4bd9-9893-70905ce30872-kube-api-access-b4jmx\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " 
pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714794 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovs-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5835f976-c6b4-4bd9-9893-70905ce30872-config\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.714918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.742661 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.743576 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.769458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.774454 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.830482 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.832930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovn-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.833175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-combined-ca-bundle\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.833330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jmx\" (UniqueName: \"kubernetes.io/projected/5835f976-c6b4-4bd9-9893-70905ce30872-kube-api-access-b4jmx\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.833433 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovs-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.833512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5835f976-c6b4-4bd9-9893-70905ce30872-config\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.833861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.835907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovs-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.838437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5835f976-c6b4-4bd9-9893-70905ce30872-config\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.839087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5835f976-c6b4-4bd9-9893-70905ce30872-ovn-rundir\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.879871 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.921435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jmx\" (UniqueName: \"kubernetes.io/projected/5835f976-c6b4-4bd9-9893-70905ce30872-kube-api-access-b4jmx\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.922411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5835f976-c6b4-4bd9-9893-70905ce30872-combined-ca-bundle\") pod \"ovn-controller-metrics-znflv\" (UID: \"5835f976-c6b4-4bd9-9893-70905ce30872\") " pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.947314 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-znflv" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.947878 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.947930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.948035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.948151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.948187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfrx8\" (UniqueName: \"kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8\") pod \"dnsmasq-dns-698758b865-q4m8b\" 
(UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.950333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.950363 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.951569 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.954954 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.955252 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.961946 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g7px6" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.962920 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 16:14:51 crc kubenswrapper[4743]: I1125 16:14:51.997679 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.051891 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8crp\" (UniqueName: \"kubernetes.io/projected/eb743ab5-16ea-4be4-95ee-00a87767602e-kube-api-access-w8crp\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052031 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052200 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-scripts\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-config\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.052318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrx8\" (UniqueName: \"kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: 
\"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.054383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.055020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.056017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.060851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.084110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfrx8\" (UniqueName: \"kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8\") pod \"dnsmasq-dns-698758b865-q4m8b\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc 
kubenswrapper[4743]: I1125 16:14:52.132808 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-scripts\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-config\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: 
I1125 16:14:52.153868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.153927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8crp\" (UniqueName: \"kubernetes.io/projected/eb743ab5-16ea-4be4-95ee-00a87767602e-kube-api-access-w8crp\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.158421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-scripts\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.158508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb743ab5-16ea-4be4-95ee-00a87767602e-config\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.158823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.173419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.174293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.191372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb743ab5-16ea-4be4-95ee-00a87767602e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.210167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8crp\" (UniqueName: \"kubernetes.io/projected/eb743ab5-16ea-4be4-95ee-00a87767602e-kube-api-access-w8crp\") pod \"ovn-northd-0\" (UID: \"eb743ab5-16ea-4be4-95ee-00a87767602e\") " pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.212323 4743 generic.go:334] "Generic (PLEG): container finished" podID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerID="e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a" exitCode=0 Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.212375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" event={"ID":"5aeef8c3-165f-49a3-a0ac-90a59da51c35","Type":"ContainerDied","Data":"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a"} Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.212414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" 
event={"ID":"5aeef8c3-165f-49a3-a0ac-90a59da51c35","Type":"ContainerDied","Data":"ecbd3961b07e5a3522cb37269c2f7ef5fe71b84ea80ded1c5dfdda48ca002f82"} Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.212452 4743 scope.go:117] "RemoveContainer" containerID="e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.213120 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9xzkx" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.257009 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgpkc\" (UniqueName: \"kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc\") pod \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.257156 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config\") pod \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.257276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc\") pod \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\" (UID: \"5aeef8c3-165f-49a3-a0ac-90a59da51c35\") " Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.262357 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc" (OuterVolumeSpecName: "kube-api-access-sgpkc") pod "5aeef8c3-165f-49a3-a0ac-90a59da51c35" (UID: "5aeef8c3-165f-49a3-a0ac-90a59da51c35"). InnerVolumeSpecName "kube-api-access-sgpkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.272366 4743 scope.go:117] "RemoveContainer" containerID="58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.283454 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.307034 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.345205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config" (OuterVolumeSpecName: "config") pod "5aeef8c3-165f-49a3-a0ac-90a59da51c35" (UID: "5aeef8c3-165f-49a3-a0ac-90a59da51c35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.347380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5aeef8c3-165f-49a3-a0ac-90a59da51c35" (UID: "5aeef8c3-165f-49a3-a0ac-90a59da51c35"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.361890 4743 scope.go:117] "RemoveContainer" containerID="e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.362683 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.362702 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgpkc\" (UniqueName: \"kubernetes.io/projected/5aeef8c3-165f-49a3-a0ac-90a59da51c35-kube-api-access-sgpkc\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.362712 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aeef8c3-165f-49a3-a0ac-90a59da51c35-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:52 crc kubenswrapper[4743]: E1125 16:14:52.380220 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a\": container with ID starting with e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a not found: ID does not exist" containerID="e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.380279 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a"} err="failed to get container status \"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a\": rpc error: code = NotFound desc = could not find container \"e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a\": container with ID starting with 
e6a7df6fdfa5934b6dc5af5207d161f1df334f329e4783e625db87000e1e727a not found: ID does not exist" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.380315 4743 scope.go:117] "RemoveContainer" containerID="58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa" Nov 25 16:14:52 crc kubenswrapper[4743]: E1125 16:14:52.381103 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa\": container with ID starting with 58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa not found: ID does not exist" containerID="58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.381146 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa"} err="failed to get container status \"58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa\": rpc error: code = NotFound desc = could not find container \"58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa\": container with ID starting with 58958002a4f659ddaec318133239ffb4f7a60f606e3d03490289d80b5465b6fa not found: ID does not exist" Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.488840 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.551101 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.561576 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9xzkx"] Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.649487 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-znflv"] Nov 25 
16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.832303 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:14:52 crc kubenswrapper[4743]: W1125 16:14:52.852405 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5e0061_9982_4b48_b8f3_877ecc734668.slice/crio-13b4a8922eae95b5807cca7537e75df4609e6364ac20faa5add1de3d3758c68c WatchSource:0}: Error finding container 13b4a8922eae95b5807cca7537e75df4609e6364ac20faa5add1de3d3758c68c: Status 404 returned error can't find the container with id 13b4a8922eae95b5807cca7537e75df4609e6364ac20faa5add1de3d3758c68c Nov 25 16:14:52 crc kubenswrapper[4743]: I1125 16:14:52.919648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 16:14:52 crc kubenswrapper[4743]: W1125 16:14:52.924216 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb743ab5_16ea_4be4_95ee_00a87767602e.slice/crio-821f6fe7b7c2cfdb89dc13e95510deacdbc0b76f46e0e37c7dae31b7f56f908e WatchSource:0}: Error finding container 821f6fe7b7c2cfdb89dc13e95510deacdbc0b76f46e0e37c7dae31b7f56f908e: Status 404 returned error can't find the container with id 821f6fe7b7c2cfdb89dc13e95510deacdbc0b76f46e0e37c7dae31b7f56f908e Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.232672 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-znflv" event={"ID":"5835f976-c6b4-4bd9-9893-70905ce30872","Type":"ContainerStarted","Data":"2464eb3f60a1eb12ef7a0a8ed686e2d349b869bac5a51ce5b4bf95176eb22c5e"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.233020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-znflv" 
event={"ID":"5835f976-c6b4-4bd9-9893-70905ce30872","Type":"ContainerStarted","Data":"f0eeac230bc3d7d21df7df45cd9b31def0f136f65bc5ae768500430107ee5802"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.241911 4743 generic.go:334] "Generic (PLEG): container finished" podID="387e3166-8fcf-4910-87ad-422372191cb9" containerID="65f26098c0ef4d02f118e7cd7f7e0795e41658bd97f4fb71c5b8fb38b70b9928" exitCode=0 Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.241998 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" event={"ID":"387e3166-8fcf-4910-87ad-422372191cb9","Type":"ContainerDied","Data":"65f26098c0ef4d02f118e7cd7f7e0795e41658bd97f4fb71c5b8fb38b70b9928"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.242045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" event={"ID":"387e3166-8fcf-4910-87ad-422372191cb9","Type":"ContainerStarted","Data":"b42384ad087d8c1721c64c3340954acd5f305ccc95bdcac97888e649c91a7256"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.247315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eb743ab5-16ea-4be4-95ee-00a87767602e","Type":"ContainerStarted","Data":"821f6fe7b7c2cfdb89dc13e95510deacdbc0b76f46e0e37c7dae31b7f56f908e"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.249198 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerID="563b557dcd5b9df464916b719c0dbd176c28ce9c9a412b91a8c7f17d94700509" exitCode=0 Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.249474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-q4m8b" event={"ID":"ad5e0061-9982-4b48-b8f3-877ecc734668","Type":"ContainerDied","Data":"563b557dcd5b9df464916b719c0dbd176c28ce9c9a412b91a8c7f17d94700509"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.249521 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-q4m8b" event={"ID":"ad5e0061-9982-4b48-b8f3-877ecc734668","Type":"ContainerStarted","Data":"13b4a8922eae95b5807cca7537e75df4609e6364ac20faa5add1de3d3758c68c"} Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.254172 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-znflv" podStartSLOduration=2.254159868 podStartE2EDuration="2.254159868s" podCreationTimestamp="2025-11-25 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:53.25103196 +0000 UTC m=+972.372871519" watchObservedRunningTime="2025-11-25 16:14:53.254159868 +0000 UTC m=+972.375999417" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.598134 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.698646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkvfd\" (UniqueName: \"kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd\") pod \"387e3166-8fcf-4910-87ad-422372191cb9\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.698746 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc\") pod \"387e3166-8fcf-4910-87ad-422372191cb9\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.793802 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" path="/var/lib/kubelet/pods/5aeef8c3-165f-49a3-a0ac-90a59da51c35/volumes" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.799659 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config\") pod \"387e3166-8fcf-4910-87ad-422372191cb9\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.799902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb\") pod \"387e3166-8fcf-4910-87ad-422372191cb9\" (UID: \"387e3166-8fcf-4910-87ad-422372191cb9\") " Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.872441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "387e3166-8fcf-4910-87ad-422372191cb9" (UID: "387e3166-8fcf-4910-87ad-422372191cb9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.875173 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "387e3166-8fcf-4910-87ad-422372191cb9" (UID: "387e3166-8fcf-4910-87ad-422372191cb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.876584 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config" (OuterVolumeSpecName: "config") pod "387e3166-8fcf-4910-87ad-422372191cb9" (UID: "387e3166-8fcf-4910-87ad-422372191cb9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.876906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd" (OuterVolumeSpecName: "kube-api-access-pkvfd") pod "387e3166-8fcf-4910-87ad-422372191cb9" (UID: "387e3166-8fcf-4910-87ad-422372191cb9"). InnerVolumeSpecName "kube-api-access-pkvfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.902315 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.902345 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkvfd\" (UniqueName: \"kubernetes.io/projected/387e3166-8fcf-4910-87ad-422372191cb9-kube-api-access-pkvfd\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.902356 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:53 crc kubenswrapper[4743]: I1125 16:14:53.902365 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/387e3166-8fcf-4910-87ad-422372191cb9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.267827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" event={"ID":"387e3166-8fcf-4910-87ad-422372191cb9","Type":"ContainerDied","Data":"b42384ad087d8c1721c64c3340954acd5f305ccc95bdcac97888e649c91a7256"} Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.267903 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-96bvs" Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.267935 4743 scope.go:117] "RemoveContainer" containerID="65f26098c0ef4d02f118e7cd7f7e0795e41658bd97f4fb71c5b8fb38b70b9928" Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.272013 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-q4m8b" event={"ID":"ad5e0061-9982-4b48-b8f3-877ecc734668","Type":"ContainerStarted","Data":"5209329b204f2ec2f717266243f9212ac731a17bd362f2758a97567e6a836f26"} Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.310570 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podStartSLOduration=3.310550835 podStartE2EDuration="3.310550835s" podCreationTimestamp="2025-11-25 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:54.303227685 +0000 UTC m=+973.425067234" watchObservedRunningTime="2025-11-25 16:14:54.310550835 +0000 UTC m=+973.432390384" Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.368401 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:54 crc kubenswrapper[4743]: I1125 16:14:54.378280 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-96bvs"] Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.280601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eb743ab5-16ea-4be4-95ee-00a87767602e","Type":"ContainerStarted","Data":"cf92df16ce5e8547bd0142fd5e7d2c6af54edf2f0e4a3b98aaf8a6c67c07d6ec"} Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.281003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 
16:14:55.281020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eb743ab5-16ea-4be4-95ee-00a87767602e","Type":"ContainerStarted","Data":"8f38919f6be43af9d7dd004923aeeedbf0085a6597270b9acd20f0c6d0c09752"} Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.299784 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.6549023309999997 podStartE2EDuration="4.299765813s" podCreationTimestamp="2025-11-25 16:14:51 +0000 UTC" firstStartedPulling="2025-11-25 16:14:52.927219354 +0000 UTC m=+972.049058903" lastFinishedPulling="2025-11-25 16:14:54.572082846 +0000 UTC m=+973.693922385" observedRunningTime="2025-11-25 16:14:55.297886164 +0000 UTC m=+974.419725733" watchObservedRunningTime="2025-11-25 16:14:55.299765813 +0000 UTC m=+974.421605362" Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.784448 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387e3166-8fcf-4910-87ad-422372191cb9" path="/var/lib/kubelet/pods/387e3166-8fcf-4910-87ad-422372191cb9/volumes" Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.969877 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 25 16:14:55 crc kubenswrapper[4743]: I1125 16:14:55.969929 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 25 16:14:56 crc kubenswrapper[4743]: I1125 16:14:56.044730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 25 16:14:56 crc kubenswrapper[4743]: I1125 16:14:56.287351 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 25 16:14:56 crc kubenswrapper[4743]: I1125 16:14:56.357624 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 25 16:14:57 crc 
kubenswrapper[4743]: I1125 16:14:57.367536 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-aa4a-account-create-mlttg"] Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.368525 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="dnsmasq-dns" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.368612 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="dnsmasq-dns" Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.368689 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="init" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.368744 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="init" Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.368822 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387e3166-8fcf-4910-87ad-422372191cb9" containerName="init" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.368912 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="387e3166-8fcf-4910-87ad-422372191cb9" containerName="init" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.369136 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="387e3166-8fcf-4910-87ad-422372191cb9" containerName="init" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.369210 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aeef8c3-165f-49a3-a0ac-90a59da51c35" containerName="dnsmasq-dns" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.369799 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.373926 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.377212 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aa4a-account-create-mlttg"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.385636 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2wn4j"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.387010 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.412710 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2wn4j"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.445289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.445339 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.484138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.484199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts\") pod \"keystone-aa4a-account-create-mlttg\" (UID: 
\"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.484234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rwns\" (UniqueName: \"kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.484288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhg74\" (UniqueName: \"kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74\") pod \"keystone-aa4a-account-create-mlttg\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.585472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhg74\" (UniqueName: \"kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74\") pod \"keystone-aa4a-account-create-mlttg\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.585691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.585711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts\") pod \"keystone-aa4a-account-create-mlttg\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.585750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rwns\" (UniqueName: \"kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.587350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.587901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts\") pod \"keystone-aa4a-account-create-mlttg\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.613269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhg74\" (UniqueName: \"kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74\") pod \"keystone-aa4a-account-create-mlttg\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") " pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.627281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rwns\" (UniqueName: 
\"kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns\") pod \"keystone-db-create-2wn4j\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") " pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.661074 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jtddz"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.662194 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.678411 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jtddz"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.695723 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa4a-account-create-mlttg" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.723046 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2wn4j" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.788387 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jhg\" (UniqueName: \"kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.788886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.800370 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f4d9-account-create-pmmxn"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.801306 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.815675 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.826002 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f4d9-account-create-pmmxn"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.869515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.892330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.892826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.893243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jhg\" (UniqueName: \"kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.893142 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.893468 4743 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 16:14:57 crc kubenswrapper[4743]: E1125 16:14:57.893542 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift podName:9ae66928-3c05-4597-98a6-f663e9df7cff nodeName:}" failed. No retries permitted until 2025-11-25 16:15:13.893515357 +0000 UTC m=+993.015354926 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift") pod "swift-storage-0" (UID: "9ae66928-3c05-4597-98a6-f663e9df7cff") : configmap "swift-ring-files" not found Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.893552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.893711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5tsj\" (UniqueName: \"kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.893872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 
16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.898957 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2j7qx"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.900357 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.926704 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2j7qx"] Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.951746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jhg\" (UniqueName: \"kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg\") pod \"placement-db-create-jtddz\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") " pod="openstack/placement-db-create-jtddz" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.996890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzm2\" (UniqueName: \"kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2\") pod \"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.997005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5tsj\" (UniqueName: \"kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.997030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts\") pod 
\"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.997053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:57 crc kubenswrapper[4743]: I1125 16:14:57.998952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.000027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jtddz" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.018343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5tsj\" (UniqueName: \"kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj\") pod \"placement-f4d9-account-create-pmmxn\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") " pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.025419 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0782-account-create-k5q7l"] Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.026662 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.032063 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.042377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0782-account-create-k5q7l"] Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.099135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qzm2\" (UniqueName: \"kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2\") pod \"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.099664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts\") pod \"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.100385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts\") pod \"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.123399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzm2\" (UniqueName: \"kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2\") pod \"glance-db-create-2j7qx\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.202028 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgrg\" (UniqueName: \"kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.202111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.212292 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.268633 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2j7qx" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.305154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgrg\" (UniqueName: \"kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.305280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.306504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.362866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgrg\" (UniqueName: \"kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg\") pod \"glance-0782-account-create-k5q7l\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.367653 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.395033 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2wn4j"] Nov 25 16:14:58 crc kubenswrapper[4743]: W1125 16:14:58.405042 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ceca1a3_ffa3_4b15_bca5_c306720941f3.slice/crio-46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2 WatchSource:0}: Error finding container 46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2: Status 404 returned error can't find the container with id 46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2 Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.475139 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.500028 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aa4a-account-create-mlttg"] Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.576860 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jtddz"] Nov 25 16:14:58 crc kubenswrapper[4743]: W1125 16:14:58.585666 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5903c1f3_91d8_4b0a_9680_d0f547df08c2.slice/crio-fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f WatchSource:0}: Error finding container fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f: Status 404 returned error can't find the container with id fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f Nov 25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.785380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f4d9-account-create-pmmxn"] Nov 
25 16:14:58 crc kubenswrapper[4743]: I1125 16:14:58.906527 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2j7qx"]
Nov 25 16:14:58 crc kubenswrapper[4743]: W1125 16:14:58.911078 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1ae1e82_cea4_4ab3_9cc7_7b2615288871.slice/crio-39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897 WatchSource:0}: Error finding container 39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897: Status 404 returned error can't find the container with id 39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.025324 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0782-account-create-k5q7l"]
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.314775 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8dtsl" podUID="7750901a-7566-4d94-8cb5-5aff66e22116" containerName="ovn-controller" probeResult="failure" output=<
Nov 25 16:14:59 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Nov 25 16:14:59 crc kubenswrapper[4743]: >
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.314848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0782-account-create-k5q7l" event={"ID":"0a21ef5d-71ae-47ce-aa24-d49830887317","Type":"ContainerStarted","Data":"45f5ef28aebece070382f1342dc3d4c003a2f4b5752bd7119ec51a0625709e56"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.314898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0782-account-create-k5q7l" event={"ID":"0a21ef5d-71ae-47ce-aa24-d49830887317","Type":"ContainerStarted","Data":"709de329154803fd0ceffb0b1b02b3d87e3b4a4acbdf36ec036f31601a284c32"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.317107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa4a-account-create-mlttg" event={"ID":"436fe732-731a-4cb7-85d6-2d4e3ef48805","Type":"ContainerStarted","Data":"9590cbdbe696c1fc0327f0b4b653113088098e0bd7decd969a63930ee20f2078"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.317148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa4a-account-create-mlttg" event={"ID":"436fe732-731a-4cb7-85d6-2d4e3ef48805","Type":"ContainerStarted","Data":"8abb7c553f139b16e43a48508cc0651a006d3ecfb224338309de110758ad69f8"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.319098 4743 generic.go:334] "Generic (PLEG): container finished" podID="f5eff179-2afc-4ec2-addc-31c3c36a6fd7" containerID="840b5269267debf5ba2619371f678be12d6462f7e5b7102d9c467b13f8a75834" exitCode=0
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.319175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m926g" event={"ID":"f5eff179-2afc-4ec2-addc-31c3c36a6fd7","Type":"ContainerDied","Data":"840b5269267debf5ba2619371f678be12d6462f7e5b7102d9c467b13f8a75834"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.320721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2j7qx" event={"ID":"f1ae1e82-cea4-4ab3-9cc7-7b2615288871","Type":"ContainerStarted","Data":"3911a14700937f91794271ed381f84a89e399c3898b17ee85e44c780807dbf2f"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.320765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2j7qx" event={"ID":"f1ae1e82-cea4-4ab3-9cc7-7b2615288871","Type":"ContainerStarted","Data":"39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.321820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jtddz" event={"ID":"5903c1f3-91d8-4b0a-9680-d0f547df08c2","Type":"ContainerStarted","Data":"5d7ec6230ac2f515462dce23f77de838c2c0e0ebdc003696878e7871d27bcb44"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.321846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jtddz" event={"ID":"5903c1f3-91d8-4b0a-9680-d0f547df08c2","Type":"ContainerStarted","Data":"fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.324228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4d9-account-create-pmmxn" event={"ID":"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f","Type":"ContainerStarted","Data":"27df768c416be8f775a14d1b8637d6485231ade77f7b2d1674ce086c86fd22be"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.324274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4d9-account-create-pmmxn" event={"ID":"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f","Type":"ContainerStarted","Data":"c27a48ab2d917ba28e1ced742b3b78d3952889bfc066c1c3de3df672a562b0db"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.325534 4743 generic.go:334] "Generic (PLEG): container finished" podID="1ceca1a3-ffa3-4b15-bca5-c306720941f3" containerID="b4044e626635467d25ee0ce321b3dc9d51cd2b8647513b802b8a5b2ba62fdfcf" exitCode=0
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.325652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wn4j" event={"ID":"1ceca1a3-ffa3-4b15-bca5-c306720941f3","Type":"ContainerDied","Data":"b4044e626635467d25ee0ce321b3dc9d51cd2b8647513b802b8a5b2ba62fdfcf"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.325811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wn4j" event={"ID":"1ceca1a3-ffa3-4b15-bca5-c306720941f3","Type":"ContainerStarted","Data":"46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2"}
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.336795 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-0782-account-create-k5q7l" podStartSLOduration=2.33677291 podStartE2EDuration="2.33677291s" podCreationTimestamp="2025-11-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:59.33169175 +0000 UTC m=+978.453531319" watchObservedRunningTime="2025-11-25 16:14:59.33677291 +0000 UTC m=+978.458612459"
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.364918 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-aa4a-account-create-mlttg" podStartSLOduration=2.364884573 podStartE2EDuration="2.364884573s" podCreationTimestamp="2025-11-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:59.363566002 +0000 UTC m=+978.485405561" watchObservedRunningTime="2025-11-25 16:14:59.364884573 +0000 UTC m=+978.486724132"
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.389927 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-2j7qx" podStartSLOduration=2.389906608 podStartE2EDuration="2.389906608s" podCreationTimestamp="2025-11-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:59.381419672 +0000 UTC m=+978.503259241" watchObservedRunningTime="2025-11-25 16:14:59.389906608 +0000 UTC m=+978.511746157"
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.404769 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-jtddz" podStartSLOduration=2.404747554 podStartE2EDuration="2.404747554s" podCreationTimestamp="2025-11-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:59.399093637 +0000 UTC m=+978.520933186" watchObservedRunningTime="2025-11-25 16:14:59.404747554 +0000 UTC m=+978.526587103"
Nov 25 16:14:59 crc kubenswrapper[4743]: I1125 16:14:59.438215 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f4d9-account-create-pmmxn" podStartSLOduration=2.438188554 podStartE2EDuration="2.438188554s" podCreationTimestamp="2025-11-25 16:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:14:59.430301497 +0000 UTC m=+978.552141066" watchObservedRunningTime="2025-11-25 16:14:59.438188554 +0000 UTC m=+978.560028103"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.154221 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"]
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.155753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.159938 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.160623 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.161382 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"]
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.236669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsnfl\" (UniqueName: \"kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.236997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.237143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.338062 4743 generic.go:334] "Generic (PLEG): container finished" podID="f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" containerID="27df768c416be8f775a14d1b8637d6485231ade77f7b2d1674ce086c86fd22be" exitCode=0
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.338158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4d9-account-create-pmmxn" event={"ID":"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f","Type":"ContainerDied","Data":"27df768c416be8f775a14d1b8637d6485231ade77f7b2d1674ce086c86fd22be"}
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.347248 4743 generic.go:334] "Generic (PLEG): container finished" podID="436fe732-731a-4cb7-85d6-2d4e3ef48805" containerID="9590cbdbe696c1fc0327f0b4b653113088098e0bd7decd969a63930ee20f2078" exitCode=0
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.347357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.347489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa4a-account-create-mlttg" event={"ID":"436fe732-731a-4cb7-85d6-2d4e3ef48805","Type":"ContainerDied","Data":"9590cbdbe696c1fc0327f0b4b653113088098e0bd7decd969a63930ee20f2078"}
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.347642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.348070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsnfl\" (UniqueName: \"kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.349341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.356422 4743 generic.go:334] "Generic (PLEG): container finished" podID="5903c1f3-91d8-4b0a-9680-d0f547df08c2" containerID="5d7ec6230ac2f515462dce23f77de838c2c0e0ebdc003696878e7871d27bcb44" exitCode=0
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.360235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jtddz" event={"ID":"5903c1f3-91d8-4b0a-9680-d0f547df08c2","Type":"ContainerDied","Data":"5d7ec6230ac2f515462dce23f77de838c2c0e0ebdc003696878e7871d27bcb44"}
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.362420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.377630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsnfl\" (UniqueName: \"kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl\") pod \"collect-profiles-29401455-nvm56\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.483937 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.816414 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2wn4j"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.833465 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m926g"
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.964804 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.964935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.964959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.964975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbq8s\" (UniqueName: \"kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.965000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rwns\" (UniqueName: \"kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns\") pod \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.965067 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.965130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.965151 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf\") pod \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\" (UID: \"f5eff179-2afc-4ec2-addc-31c3c36a6fd7\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.965178 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts\") pod \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\" (UID: \"1ceca1a3-ffa3-4b15-bca5-c306720941f3\") "
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.966269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ceca1a3-ffa3-4b15-bca5-c306720941f3" (UID: "1ceca1a3-ffa3-4b15-bca5-c306720941f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.967256 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.967278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.975028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns" (OuterVolumeSpecName: "kube-api-access-4rwns") pod "1ceca1a3-ffa3-4b15-bca5-c306720941f3" (UID: "1ceca1a3-ffa3-4b15-bca5-c306720941f3"). InnerVolumeSpecName "kube-api-access-4rwns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.976056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s" (OuterVolumeSpecName: "kube-api-access-nbq8s") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "kube-api-access-nbq8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.979384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.980750 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"]
Nov 25 16:15:00 crc kubenswrapper[4743]: W1125 16:15:00.990400 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd70c16b3_cf20_4bfe_a816_00d5fd9c2885.slice/crio-cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954 WatchSource:0}: Error finding container cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954: Status 404 returned error can't find the container with id cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954
Nov 25 16:15:00 crc kubenswrapper[4743]: I1125 16:15:00.998497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts" (OuterVolumeSpecName: "scripts") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.001740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.005435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f5eff179-2afc-4ec2-addc-31c3c36a6fd7" (UID: "f5eff179-2afc-4ec2-addc-31c3c36a6fd7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067348 4743 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067371 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067381 4743 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-swiftconf\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067393 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ceca1a3-ffa3-4b15-bca5-c306720941f3-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067402 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067411 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-etc-swift\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067419 4743 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-dispersionconf\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067430 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbq8s\" (UniqueName: \"kubernetes.io/projected/f5eff179-2afc-4ec2-addc-31c3c36a6fd7-kube-api-access-nbq8s\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.067439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rwns\" (UniqueName: \"kubernetes.io/projected/1ceca1a3-ffa3-4b15-bca5-c306720941f3-kube-api-access-4rwns\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.367303 4743 generic.go:334] "Generic (PLEG): container finished" podID="f1ae1e82-cea4-4ab3-9cc7-7b2615288871" containerID="3911a14700937f91794271ed381f84a89e399c3898b17ee85e44c780807dbf2f" exitCode=0
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.367414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2j7qx" event={"ID":"f1ae1e82-cea4-4ab3-9cc7-7b2615288871","Type":"ContainerDied","Data":"3911a14700937f91794271ed381f84a89e399c3898b17ee85e44c780807dbf2f"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.369111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" event={"ID":"d70c16b3-cf20-4bfe-a816-00d5fd9c2885","Type":"ContainerStarted","Data":"9cc92ade8000551cfd97686c8040d267c296253f7248447560dd0af4dfa0e206"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.369142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" event={"ID":"d70c16b3-cf20-4bfe-a816-00d5fd9c2885","Type":"ContainerStarted","Data":"cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.370898 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2wn4j"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.370918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wn4j" event={"ID":"1ceca1a3-ffa3-4b15-bca5-c306720941f3","Type":"ContainerDied","Data":"46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.370973 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f9ab77c24a5c4f815b98fedbcfc22c4205bcc160a019f7a07daae3e66308e2"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.373440 4743 generic.go:334] "Generic (PLEG): container finished" podID="0a21ef5d-71ae-47ce-aa24-d49830887317" containerID="45f5ef28aebece070382f1342dc3d4c003a2f4b5752bd7119ec51a0625709e56" exitCode=0
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.373498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0782-account-create-k5q7l" event={"ID":"0a21ef5d-71ae-47ce-aa24-d49830887317","Type":"ContainerDied","Data":"45f5ef28aebece070382f1342dc3d4c003a2f4b5752bd7119ec51a0625709e56"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.381261 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m926g"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.381995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m926g" event={"ID":"f5eff179-2afc-4ec2-addc-31c3c36a6fd7","Type":"ContainerDied","Data":"cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5"}
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.382061 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd71c069f6c31153f84908ebccb319c5f27b31eb7db8e5b6b7aeae9f46dc0af5"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.487860 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" podStartSLOduration=1.487840246 podStartE2EDuration="1.487840246s" podCreationTimestamp="2025-11-25 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:01.468502399 +0000 UTC m=+980.590341958" watchObservedRunningTime="2025-11-25 16:15:01.487840246 +0000 UTC m=+980.609679795"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.904098 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f4d9-account-create-pmmxn"
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.987271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts\") pod \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") "
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.987422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5tsj\" (UniqueName: \"kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj\") pod \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\" (UID: \"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f\") "
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.988627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" (UID: "f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:01 crc kubenswrapper[4743]: I1125 16:15:01.995147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj" (OuterVolumeSpecName: "kube-api-access-j5tsj") pod "f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" (UID: "f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f"). InnerVolumeSpecName "kube-api-access-j5tsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.054343 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa4a-account-create-mlttg"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.059383 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jtddz"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.090775 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.090809 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5tsj\" (UniqueName: \"kubernetes.io/projected/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f-kube-api-access-j5tsj\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.191282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts\") pod \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") "
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.191366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts\") pod \"436fe732-731a-4cb7-85d6-2d4e3ef48805\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") "
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.191480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhg74\" (UniqueName: \"kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74\") pod \"436fe732-731a-4cb7-85d6-2d4e3ef48805\" (UID: \"436fe732-731a-4cb7-85d6-2d4e3ef48805\") "
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.191518 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jhg\" (UniqueName: \"kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg\") pod \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\" (UID: \"5903c1f3-91d8-4b0a-9680-d0f547df08c2\") "
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.191898 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "436fe732-731a-4cb7-85d6-2d4e3ef48805" (UID: "436fe732-731a-4cb7-85d6-2d4e3ef48805"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.192011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5903c1f3-91d8-4b0a-9680-d0f547df08c2" (UID: "5903c1f3-91d8-4b0a-9680-d0f547df08c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.192143 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436fe732-731a-4cb7-85d6-2d4e3ef48805-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.196242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74" (OuterVolumeSpecName: "kube-api-access-bhg74") pod "436fe732-731a-4cb7-85d6-2d4e3ef48805" (UID: "436fe732-731a-4cb7-85d6-2d4e3ef48805"). InnerVolumeSpecName "kube-api-access-bhg74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.197843 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg" (OuterVolumeSpecName: "kube-api-access-k8jhg") pod "5903c1f3-91d8-4b0a-9680-d0f547df08c2" (UID: "5903c1f3-91d8-4b0a-9680-d0f547df08c2"). InnerVolumeSpecName "kube-api-access-k8jhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.286781 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-q4m8b"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.294024 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhg74\" (UniqueName: \"kubernetes.io/projected/436fe732-731a-4cb7-85d6-2d4e3ef48805-kube-api-access-bhg74\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.294241 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jhg\" (UniqueName: \"kubernetes.io/projected/5903c1f3-91d8-4b0a-9680-d0f547df08c2-kube-api-access-k8jhg\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.294320 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5903c1f3-91d8-4b0a-9680-d0f547df08c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.363267 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"]
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.363504 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="dnsmasq-dns" containerID="cri-o://3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2" gracePeriod=10
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.398079 4743 generic.go:334] "Generic (PLEG): container finished" podID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerID="e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9" exitCode=0
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.398207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerDied","Data":"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9"}
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.403636 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa4a-account-create-mlttg"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.404069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa4a-account-create-mlttg" event={"ID":"436fe732-731a-4cb7-85d6-2d4e3ef48805","Type":"ContainerDied","Data":"8abb7c553f139b16e43a48508cc0651a006d3ecfb224338309de110758ad69f8"}
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.404096 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abb7c553f139b16e43a48508cc0651a006d3ecfb224338309de110758ad69f8"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.408323 4743 generic.go:334] "Generic (PLEG): container finished" podID="99b737b1-8d17-4abc-a898-1ceedff80421" containerID="45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6" exitCode=0
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.408473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerDied","Data":"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6"}
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.411802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jtddz" event={"ID":"5903c1f3-91d8-4b0a-9680-d0f547df08c2","Type":"ContainerDied","Data":"fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f"}
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.411857 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb5af72502d85b524b23c6a325dbd03c8be15eddf7cd5b56226666e4924284f"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.411929 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jtddz"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.418741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4d9-account-create-pmmxn" event={"ID":"f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f","Type":"ContainerDied","Data":"c27a48ab2d917ba28e1ced742b3b78d3952889bfc066c1c3de3df672a562b0db"}
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.418801 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27a48ab2d917ba28e1ced742b3b78d3952889bfc066c1c3de3df672a562b0db"
Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.418886 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f4d9-account-create-pmmxn" Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.434128 4743 generic.go:334] "Generic (PLEG): container finished" podID="d70c16b3-cf20-4bfe-a816-00d5fd9c2885" containerID="9cc92ade8000551cfd97686c8040d267c296253f7248447560dd0af4dfa0e206" exitCode=0 Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.434488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" event={"ID":"d70c16b3-cf20-4bfe-a816-00d5fd9c2885","Type":"ContainerDied","Data":"9cc92ade8000551cfd97686c8040d267c296253f7248447560dd0af4dfa0e206"} Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.959581 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:15:02 crc kubenswrapper[4743]: I1125 16:15:02.966539 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2j7qx" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.006038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsgrg\" (UniqueName: \"kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg\") pod \"0a21ef5d-71ae-47ce-aa24-d49830887317\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.006232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts\") pod \"0a21ef5d-71ae-47ce-aa24-d49830887317\" (UID: \"0a21ef5d-71ae-47ce-aa24-d49830887317\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.007387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "0a21ef5d-71ae-47ce-aa24-d49830887317" (UID: "0a21ef5d-71ae-47ce-aa24-d49830887317"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.011362 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg" (OuterVolumeSpecName: "kube-api-access-nsgrg") pod "0a21ef5d-71ae-47ce-aa24-d49830887317" (UID: "0a21ef5d-71ae-47ce-aa24-d49830887317"). InnerVolumeSpecName "kube-api-access-nsgrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.107910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts\") pod \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.108120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qzm2\" (UniqueName: \"kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2\") pod \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\" (UID: \"f1ae1e82-cea4-4ab3-9cc7-7b2615288871\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.108515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1ae1e82-cea4-4ab3-9cc7-7b2615288871" (UID: "f1ae1e82-cea4-4ab3-9cc7-7b2615288871"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.108675 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsgrg\" (UniqueName: \"kubernetes.io/projected/0a21ef5d-71ae-47ce-aa24-d49830887317-kube-api-access-nsgrg\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.108698 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.108712 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a21ef5d-71ae-47ce-aa24-d49830887317-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.114780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2" (OuterVolumeSpecName: "kube-api-access-4qzm2") pod "f1ae1e82-cea4-4ab3-9cc7-7b2615288871" (UID: "f1ae1e82-cea4-4ab3-9cc7-7b2615288871"). InnerVolumeSpecName "kube-api-access-4qzm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.211163 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qzm2\" (UniqueName: \"kubernetes.io/projected/f1ae1e82-cea4-4ab3-9cc7-7b2615288871-kube-api-access-4qzm2\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.403752 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.442864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2j7qx" event={"ID":"f1ae1e82-cea4-4ab3-9cc7-7b2615288871","Type":"ContainerDied","Data":"39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.443463 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e1cff34489f1059cfacab33f0e7325a00cee6ebf9df8552673e90abd232897" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.443377 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2j7qx" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.445780 4743 generic.go:334] "Generic (PLEG): container finished" podID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerID="3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2" exitCode=0 Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.445850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" event={"ID":"db79aaa9-4b33-4db5-991f-9a4b3aee84ae","Type":"ContainerDied","Data":"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.445959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" event={"ID":"db79aaa9-4b33-4db5-991f-9a4b3aee84ae","Type":"ContainerDied","Data":"74d074321693d925aa187b166738ec4c824e7d4f8312e4b19c8139eebdbb0483"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.445981 4743 scope.go:117] "RemoveContainer" containerID="3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.445874 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8jpbt" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.449978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerStarted","Data":"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.450234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.458100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0782-account-create-k5q7l" event={"ID":"0a21ef5d-71ae-47ce-aa24-d49830887317","Type":"ContainerDied","Data":"709de329154803fd0ceffb0b1b02b3d87e3b4a4acbdf36ec036f31601a284c32"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.458142 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="709de329154803fd0ceffb0b1b02b3d87e3b4a4acbdf36ec036f31601a284c32" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.458210 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0782-account-create-k5q7l" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.473655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerStarted","Data":"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6"} Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.474305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.474748 4743 scope.go:117] "RemoveContainer" containerID="d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.508533 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.71607338 podStartE2EDuration="1m0.508511487s" podCreationTimestamp="2025-11-25 16:14:03 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.688452774 +0000 UTC m=+936.810292313" lastFinishedPulling="2025-11-25 16:14:27.480890871 +0000 UTC m=+946.602730420" observedRunningTime="2025-11-25 16:15:03.499549916 +0000 UTC m=+982.621389495" watchObservedRunningTime="2025-11-25 16:15:03.508511487 +0000 UTC m=+982.630351046" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.516001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4lmb\" (UniqueName: \"kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb\") pod \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.516071 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config\") pod \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\" 
(UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.516204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc\") pod \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\" (UID: \"db79aaa9-4b33-4db5-991f-9a4b3aee84ae\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.517813 4743 scope.go:117] "RemoveContainer" containerID="3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2" Nov 25 16:15:03 crc kubenswrapper[4743]: E1125 16:15:03.521431 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2\": container with ID starting with 3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2 not found: ID does not exist" containerID="3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.521483 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2"} err="failed to get container status \"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2\": rpc error: code = NotFound desc = could not find container \"3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2\": container with ID starting with 3818a0a0738abe4adae403a3161611c5df4158ff3518ae3e343826dc49bd2be2 not found: ID does not exist" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.521513 4743 scope.go:117] "RemoveContainer" containerID="d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e" Nov 25 16:15:03 crc kubenswrapper[4743]: E1125 16:15:03.522000 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e\": container with ID starting with d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e not found: ID does not exist" containerID="d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.522035 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e"} err="failed to get container status \"d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e\": rpc error: code = NotFound desc = could not find container \"d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e\": container with ID starting with d1bbc4b02ffca1c683e7ac9994288b910d7a5750bac721b4fd1b54711d7a1e6e not found: ID does not exist" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.526927 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb" (OuterVolumeSpecName: "kube-api-access-t4lmb") pod "db79aaa9-4b33-4db5-991f-9a4b3aee84ae" (UID: "db79aaa9-4b33-4db5-991f-9a4b3aee84ae"). InnerVolumeSpecName "kube-api-access-t4lmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.544202 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.327976902 podStartE2EDuration="1m0.544176517s" podCreationTimestamp="2025-11-25 16:14:03 +0000 UTC" firstStartedPulling="2025-11-25 16:14:17.264988146 +0000 UTC m=+936.386827695" lastFinishedPulling="2025-11-25 16:14:27.481187761 +0000 UTC m=+946.603027310" observedRunningTime="2025-11-25 16:15:03.539506261 +0000 UTC m=+982.661345830" watchObservedRunningTime="2025-11-25 16:15:03.544176517 +0000 UTC m=+982.666016066" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.557916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config" (OuterVolumeSpecName: "config") pod "db79aaa9-4b33-4db5-991f-9a4b3aee84ae" (UID: "db79aaa9-4b33-4db5-991f-9a4b3aee84ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.571827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db79aaa9-4b33-4db5-991f-9a4b3aee84ae" (UID: "db79aaa9-4b33-4db5-991f-9a4b3aee84ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.619358 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4lmb\" (UniqueName: \"kubernetes.io/projected/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-kube-api-access-t4lmb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.619396 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.619412 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db79aaa9-4b33-4db5-991f-9a4b3aee84ae-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.791217 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"] Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.807741 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8jpbt"] Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.900971 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.923122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume\") pod \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.923184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsnfl\" (UniqueName: \"kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl\") pod \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.923286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume\") pod \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\" (UID: \"d70c16b3-cf20-4bfe-a816-00d5fd9c2885\") " Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.923964 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume" (OuterVolumeSpecName: "config-volume") pod "d70c16b3-cf20-4bfe-a816-00d5fd9c2885" (UID: "d70c16b3-cf20-4bfe-a816-00d5fd9c2885"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.928724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d70c16b3-cf20-4bfe-a816-00d5fd9c2885" (UID: "d70c16b3-cf20-4bfe-a816-00d5fd9c2885"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:03 crc kubenswrapper[4743]: I1125 16:15:03.936835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl" (OuterVolumeSpecName: "kube-api-access-lsnfl") pod "d70c16b3-cf20-4bfe-a816-00d5fd9c2885" (UID: "d70c16b3-cf20-4bfe-a816-00d5fd9c2885"). InnerVolumeSpecName "kube-api-access-lsnfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.026338 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.026396 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsnfl\" (UniqueName: \"kubernetes.io/projected/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-kube-api-access-lsnfl\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.026407 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d70c16b3-cf20-4bfe-a816-00d5fd9c2885-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.311714 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8dtsl" podUID="7750901a-7566-4d94-8cb5-5aff66e22116" containerName="ovn-controller" probeResult="failure" output=< Nov 25 16:15:04 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 25 16:15:04 crc kubenswrapper[4743]: > Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.388914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lmnwx" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 
16:15:04.390096 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lmnwx" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.484064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" event={"ID":"d70c16b3-cf20-4bfe-a816-00d5fd9c2885","Type":"ContainerDied","Data":"cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954"} Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.484119 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb0eba7780e4620217b0d9294a156cee715bcd11def550253ca0d827ae95954" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.484169 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625150 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8dtsl-config-ss69q"] Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625744 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a21ef5d-71ae-47ce-aa24-d49830887317" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625770 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a21ef5d-71ae-47ce-aa24-d49830887317" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eff179-2afc-4ec2-addc-31c3c36a6fd7" containerName="swift-ring-rebalance" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eff179-2afc-4ec2-addc-31c3c36a6fd7" containerName="swift-ring-rebalance" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625812 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="dnsmasq-dns" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625820 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="dnsmasq-dns" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625832 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d70c16b3-cf20-4bfe-a816-00d5fd9c2885" containerName="collect-profiles" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625839 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d70c16b3-cf20-4bfe-a816-00d5fd9c2885" containerName="collect-profiles" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625872 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ceca1a3-ffa3-4b15-bca5-c306720941f3" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625900 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceca1a3-ffa3-4b15-bca5-c306720941f3" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625913 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5903c1f3-91d8-4b0a-9680-d0f547df08c2" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625922 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5903c1f3-91d8-4b0a-9680-d0f547df08c2" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625938 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436fe732-731a-4cb7-85d6-2d4e3ef48805" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625946 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="436fe732-731a-4cb7-85d6-2d4e3ef48805" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625959 
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="init" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625967 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="init" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.625984 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ae1e82-cea4-4ab3-9cc7-7b2615288871" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.625992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ae1e82-cea4-4ab3-9cc7-7b2615288871" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: E1125 16:15:04.626000 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626008 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626216 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ae1e82-cea4-4ab3-9cc7-7b2615288871" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626231 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="436fe732-731a-4cb7-85d6-2d4e3ef48805" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626242 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626250 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d70c16b3-cf20-4bfe-a816-00d5fd9c2885" containerName="collect-profiles" Nov 25 16:15:04 crc 
kubenswrapper[4743]: I1125 16:15:04.626264 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a21ef5d-71ae-47ce-aa24-d49830887317" containerName="mariadb-account-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626273 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eff179-2afc-4ec2-addc-31c3c36a6fd7" containerName="swift-ring-rebalance" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626281 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" containerName="dnsmasq-dns" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626289 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ceca1a3-ffa3-4b15-bca5-c306720941f3" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.626300 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5903c1f3-91d8-4b0a-9680-d0f547df08c2" containerName="mariadb-database-create" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.627000 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.632868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8dtsl-config-ss69q"] Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.633688 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.654296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8bf7\" (UniqueName: \"kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.654493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.654647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.654698 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: 
\"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.655050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.655241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.756386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.758692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8bf7\" (UniqueName: \"kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.758619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: 
\"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.758773 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.758848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.758875 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.759115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.759159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " 
pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.759190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.759322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.759485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.779549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8bf7\" (UniqueName: \"kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7\") pod \"ovn-controller-8dtsl-config-ss69q\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:04 crc kubenswrapper[4743]: I1125 16:15:04.947825 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:05 crc kubenswrapper[4743]: I1125 16:15:05.386797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8dtsl-config-ss69q"] Nov 25 16:15:05 crc kubenswrapper[4743]: I1125 16:15:05.506343 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8dtsl-config-ss69q" event={"ID":"34ef52d0-52e9-4fc9-97c3-94d5027bd566","Type":"ContainerStarted","Data":"ff13743a63d1b6ab52c1ae33e6953210d2512e3a640169d94dba7c277136dce4"} Nov 25 16:15:05 crc kubenswrapper[4743]: I1125 16:15:05.785919 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db79aaa9-4b33-4db5-991f-9a4b3aee84ae" path="/var/lib/kubelet/pods/db79aaa9-4b33-4db5-991f-9a4b3aee84ae/volumes" Nov 25 16:15:06 crc kubenswrapper[4743]: I1125 16:15:06.520342 4743 generic.go:334] "Generic (PLEG): container finished" podID="34ef52d0-52e9-4fc9-97c3-94d5027bd566" containerID="3ae3f9b51fe57af31c27665590f2965eae98b618081c1e9cabe7276689de9c1b" exitCode=0 Nov 25 16:15:06 crc kubenswrapper[4743]: I1125 16:15:06.520420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8dtsl-config-ss69q" event={"ID":"34ef52d0-52e9-4fc9-97c3-94d5027bd566","Type":"ContainerDied","Data":"3ae3f9b51fe57af31c27665590f2965eae98b618081c1e9cabe7276689de9c1b"} Nov 25 16:15:07 crc kubenswrapper[4743]: I1125 16:15:07.363657 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.112786 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233063 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run" (OuterVolumeSpecName: "var-run") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8bf7\" (UniqueName: \"kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233654 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233702 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts\") pod \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\" (UID: \"34ef52d0-52e9-4fc9-97c3-94d5027bd566\") " Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.233749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.234427 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.234455 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.234433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.234468 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/34ef52d0-52e9-4fc9-97c3-94d5027bd566-var-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.234875 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts" (OuterVolumeSpecName: "scripts") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.253771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7" (OuterVolumeSpecName: "kube-api-access-r8bf7") pod "34ef52d0-52e9-4fc9-97c3-94d5027bd566" (UID: "34ef52d0-52e9-4fc9-97c3-94d5027bd566"). InnerVolumeSpecName "kube-api-access-r8bf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.324230 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gcflw"] Nov 25 16:15:08 crc kubenswrapper[4743]: E1125 16:15:08.324716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ef52d0-52e9-4fc9-97c3-94d5027bd566" containerName="ovn-config" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.324735 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ef52d0-52e9-4fc9-97c3-94d5027bd566" containerName="ovn-config" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.324933 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ef52d0-52e9-4fc9-97c3-94d5027bd566" containerName="ovn-config" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.325642 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.329212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bdp5k" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.329927 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.335905 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8bf7\" (UniqueName: \"kubernetes.io/projected/34ef52d0-52e9-4fc9-97c3-94d5027bd566-kube-api-access-r8bf7\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.335943 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.335959 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34ef52d0-52e9-4fc9-97c3-94d5027bd566-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.357485 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gcflw"] Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.437418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.437499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.437568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.437616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdhf\" (UniqueName: \"kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.540019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.540122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.540154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdhf\" (UniqueName: 
\"kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.540195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.544758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.544809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.546557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle\") pod \"glance-db-sync-gcflw\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.560179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdhf\" (UniqueName: \"kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf\") pod \"glance-db-sync-gcflw\" (UID: 
\"282877d5-d175-44e4-a9da-3253dd8c4d95\") " pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.568264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8dtsl-config-ss69q" event={"ID":"34ef52d0-52e9-4fc9-97c3-94d5027bd566","Type":"ContainerDied","Data":"ff13743a63d1b6ab52c1ae33e6953210d2512e3a640169d94dba7c277136dce4"} Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.568529 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff13743a63d1b6ab52c1ae33e6953210d2512e3a640169d94dba7c277136dce4" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.568303 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8dtsl-config-ss69q" Nov 25 16:15:08 crc kubenswrapper[4743]: I1125 16:15:08.644083 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.233689 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8dtsl-config-ss69q"] Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.241261 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8dtsl-config-ss69q"] Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.277932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gcflw"] Nov 25 16:15:09 crc kubenswrapper[4743]: W1125 16:15:09.308284 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod282877d5_d175_44e4_a9da_3253dd8c4d95.slice/crio-e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf WatchSource:0}: Error finding container e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf: Status 404 returned error can't find the container with id 
e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.322970 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8dtsl" Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.582378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gcflw" event={"ID":"282877d5-d175-44e4-a9da-3253dd8c4d95","Type":"ContainerStarted","Data":"e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf"} Nov 25 16:15:09 crc kubenswrapper[4743]: I1125 16:15:09.786043 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ef52d0-52e9-4fc9-97c3-94d5027bd566" path="/var/lib/kubelet/pods/34ef52d0-52e9-4fc9-97c3-94d5027bd566/volumes" Nov 25 16:15:13 crc kubenswrapper[4743]: I1125 16:15:13.931426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:15:13 crc kubenswrapper[4743]: I1125 16:15:13.959992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ae66928-3c05-4597-98a6-f663e9df7cff-etc-swift\") pod \"swift-storage-0\" (UID: \"9ae66928-3c05-4597-98a6-f663e9df7cff\") " pod="openstack/swift-storage-0" Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.239353 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.647834 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.905803 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.930616 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hkb7f"] Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.931949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:14 crc kubenswrapper[4743]: I1125 16:15:14.952505 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hkb7f"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.046717 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rjl8d"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.048255 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.055404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bx9p\" (UniqueName: \"kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.055444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.062551 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rjl8d"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.135108 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6edd-account-create-4mchv"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.136340 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.143128 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7xrx\" (UniqueName: \"kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156845 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bx9p\" (UniqueName: \"kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.156938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qcr5\" (UniqueName: \"kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.157755 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6edd-account-create-4mchv"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.157936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.183623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bx9p\" (UniqueName: \"kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p\") pod \"cinder-db-create-hkb7f\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") " pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.241448 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-27cb-account-create-d8dqk"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.249776 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.253668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27cb-account-create-d8dqk"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.254044 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hkb7f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.257137 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf54n\" (UniqueName: \"kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qcr5\" (UniqueName: \"kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.258209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7xrx\" (UniqueName: \"kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.259153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.259310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.277077 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7xrx\" (UniqueName: 
\"kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx\") pod \"barbican-6edd-account-create-4mchv\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.284006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qcr5\" (UniqueName: \"kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5\") pod \"barbican-db-create-rjl8d\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") " pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.315787 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cfq9v"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.316962 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.320517 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-znjss" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.320787 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.320927 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.321641 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.328291 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cfq9v"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.359900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.359962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.360018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf54n\" (UniqueName: \"kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.360125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4bc\" (UniqueName: \"kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.360175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.360946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.364215 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rjl8d" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.376655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf54n\" (UniqueName: \"kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n\") pod \"cinder-27cb-account-create-d8dqk\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") " pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.427533 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-scfvn"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.428792 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.435962 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f84a-account-create-9rs6f"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.437035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.442504 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.462007 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.484375 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-scfvn"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p68q\" (UniqueName: \"kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mfdr\" (UniqueName: \"kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491547 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls4bc\" (UniqueName: \"kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.491611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.497759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.507217 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f84a-account-create-9rs6f"] Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.513208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: 
I1125 16:15:15.522444 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4bc\" (UniqueName: \"kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc\") pod \"keystone-db-sync-cfq9v\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.594152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p68q\" (UniqueName: \"kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.594211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfdr\" (UniqueName: \"kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.594232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.594274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.595898 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.596390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.610746 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p68q\" (UniqueName: \"kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q\") pod \"neutron-db-create-scfvn\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") " pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.612030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfdr\" (UniqueName: \"kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr\") pod \"neutron-f84a-account-create-9rs6f\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") " pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.656032 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27cb-account-create-d8dqk" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.673614 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.754535 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-scfvn" Nov 25 16:15:15 crc kubenswrapper[4743]: I1125 16:15:15.768501 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f84a-account-create-9rs6f" Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.420915 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6edd-account-create-4mchv"] Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.434321 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-scfvn"] Nov 25 16:15:21 crc kubenswrapper[4743]: W1125 16:15:21.437434 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f77a2c7_d59b_43a4_9ffb_4edd475a57eb.slice/crio-6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d WatchSource:0}: Error finding container 6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d: Status 404 returned error can't find the container with id 6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.557753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cfq9v"] Nov 25 16:15:21 crc kubenswrapper[4743]: W1125 16:15:21.562564 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod548ce1d7_a489_4a29_9e63_56b28e48f7e1.slice/crio-a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856 WatchSource:0}: Error finding container a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856: Status 404 returned error can't find the container with id a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856 Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.565138 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27cb-account-create-d8dqk"] Nov 25 
16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.572522 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hkb7f"] Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.581768 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rjl8d"] Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.654132 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.694083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rjl8d" event={"ID":"197ff5f2-2482-4735-8a0d-8b77ed613724","Type":"ContainerStarted","Data":"83c3a7abbd4317a481a70e9b74ca0aa982ee48d5c144dfb88f0001d598a71d15"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.695111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cfq9v" event={"ID":"548ce1d7-a489-4a29-9e63-56b28e48f7e1","Type":"ContainerStarted","Data":"a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.696232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27cb-account-create-d8dqk" event={"ID":"d4715ed9-7cbe-4519-84fd-db690a43a69f","Type":"ContainerStarted","Data":"1b1fc8680859004d3e8b5c742bd8c750abe93014db28db7cfb60477f25849a2b"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.697636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"bf463d6f7c88146b655a1be2d07feacda3f5ce00eee7acc480eec323a40aba80"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.700642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gcflw" 
event={"ID":"282877d5-d175-44e4-a9da-3253dd8c4d95","Type":"ContainerStarted","Data":"6e81e903379dd1bb42887a7fcb498d24ffdc267ba0f44ec87102ff49c3e5b2d1"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.704436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6edd-account-create-4mchv" event={"ID":"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb","Type":"ContainerStarted","Data":"7d00eebe9af657c37ba2cb40280dee2d66b7d136106621bc29e6778f8f44a92c"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.704508 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6edd-account-create-4mchv" event={"ID":"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb","Type":"ContainerStarted","Data":"6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.712644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-scfvn" event={"ID":"bb815cd5-463d-4056-b4af-e9c6cf18a9c7","Type":"ContainerStarted","Data":"033fef5562d777e22b51c09e80c321befea9c2bbdc6b911c6bfabbb7fd4ab60f"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.712718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-scfvn" event={"ID":"bb815cd5-463d-4056-b4af-e9c6cf18a9c7","Type":"ContainerStarted","Data":"62912b31a8c116bdbe13f2bedb329e9cffebf710ef5c76e22e3027781e1cb22f"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.716697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hkb7f" event={"ID":"dad1f368-dada-4a6b-8060-9aed7b85a828","Type":"ContainerStarted","Data":"1821440e4c89510703328ae6ddd6ab811ea3457dd2644a3453909412b80d02f3"} Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.724688 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gcflw" podStartSLOduration=2.124963951 podStartE2EDuration="13.72466554s" podCreationTimestamp="2025-11-25 
16:15:08 +0000 UTC" firstStartedPulling="2025-11-25 16:15:09.311521581 +0000 UTC m=+988.433361130" lastFinishedPulling="2025-11-25 16:15:20.91122318 +0000 UTC m=+1000.033062719" observedRunningTime="2025-11-25 16:15:21.721178871 +0000 UTC m=+1000.843018420" watchObservedRunningTime="2025-11-25 16:15:21.72466554 +0000 UTC m=+1000.846505089" Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.738901 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-scfvn" podStartSLOduration=6.738882296 podStartE2EDuration="6.738882296s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:21.737555765 +0000 UTC m=+1000.859395354" watchObservedRunningTime="2025-11-25 16:15:21.738882296 +0000 UTC m=+1000.860721845" Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.759644 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f84a-account-create-9rs6f"] Nov 25 16:15:21 crc kubenswrapper[4743]: I1125 16:15:21.759794 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-6edd-account-create-4mchv" podStartSLOduration=6.759777273 podStartE2EDuration="6.759777273s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:21.752805253 +0000 UTC m=+1000.874644802" watchObservedRunningTime="2025-11-25 16:15:21.759777273 +0000 UTC m=+1000.881616822" Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.724068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f84a-account-create-9rs6f" event={"ID":"d466b91c-227b-424c-a235-7ef50de97f94","Type":"ContainerStarted","Data":"0d8927e45b70c2d03b6e2c4c1d1d644efa968af62e197c42c679207f8940c293"} Nov 25 16:15:22 crc 
kubenswrapper[4743]: I1125 16:15:22.724414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f84a-account-create-9rs6f" event={"ID":"d466b91c-227b-424c-a235-7ef50de97f94","Type":"ContainerStarted","Data":"19b96199d2b4713f5d30e442cf9e5ee9c87c7dfb58ff51c862fb309dbb568f30"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.730348 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb815cd5-463d-4056-b4af-e9c6cf18a9c7" containerID="033fef5562d777e22b51c09e80c321befea9c2bbdc6b911c6bfabbb7fd4ab60f" exitCode=0 Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.730436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-scfvn" event={"ID":"bb815cd5-463d-4056-b4af-e9c6cf18a9c7","Type":"ContainerDied","Data":"033fef5562d777e22b51c09e80c321befea9c2bbdc6b911c6bfabbb7fd4ab60f"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.732077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hkb7f" event={"ID":"dad1f368-dada-4a6b-8060-9aed7b85a828","Type":"ContainerStarted","Data":"c6e837874ff915e5c536412cb3da3cd182850c1c913b1c73f86a43bd12f82377"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.733308 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rjl8d" event={"ID":"197ff5f2-2482-4735-8a0d-8b77ed613724","Type":"ContainerStarted","Data":"e946d2beaf2e652e9b437f9f865a3ebb63c21a0d4cc46ac8e5786ba2ea1c6994"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.734540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27cb-account-create-d8dqk" event={"ID":"d4715ed9-7cbe-4519-84fd-db690a43a69f","Type":"ContainerStarted","Data":"7df8f285b57c3b4402edc15443199ba89cb81950f6462b68f2876c587a38f0ff"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.735676 4743 generic.go:334] "Generic (PLEG): container finished" podID="9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" 
containerID="7d00eebe9af657c37ba2cb40280dee2d66b7d136106621bc29e6778f8f44a92c" exitCode=0 Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.736459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6edd-account-create-4mchv" event={"ID":"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb","Type":"ContainerDied","Data":"7d00eebe9af657c37ba2cb40280dee2d66b7d136106621bc29e6778f8f44a92c"} Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.741943 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f84a-account-create-9rs6f" podStartSLOduration=7.741926988 podStartE2EDuration="7.741926988s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:22.74164894 +0000 UTC m=+1001.863488489" watchObservedRunningTime="2025-11-25 16:15:22.741926988 +0000 UTC m=+1001.863766537" Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.773094 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-27cb-account-create-d8dqk" podStartSLOduration=7.773075766 podStartE2EDuration="7.773075766s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:22.770283329 +0000 UTC m=+1001.892122878" watchObservedRunningTime="2025-11-25 16:15:22.773075766 +0000 UTC m=+1001.894915315" Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.796265 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-rjl8d" podStartSLOduration=7.796248164 podStartE2EDuration="7.796248164s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 
16:15:22.79292456 +0000 UTC m=+1001.914764109" watchObservedRunningTime="2025-11-25 16:15:22.796248164 +0000 UTC m=+1001.918087703" Nov 25 16:15:22 crc kubenswrapper[4743]: I1125 16:15:22.824805 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-hkb7f" podStartSLOduration=8.824787809 podStartE2EDuration="8.824787809s" podCreationTimestamp="2025-11-25 16:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:22.818580825 +0000 UTC m=+1001.940420384" watchObservedRunningTime="2025-11-25 16:15:22.824787809 +0000 UTC m=+1001.946627368" Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.754411 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4715ed9-7cbe-4519-84fd-db690a43a69f" containerID="7df8f285b57c3b4402edc15443199ba89cb81950f6462b68f2876c587a38f0ff" exitCode=0 Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.754473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27cb-account-create-d8dqk" event={"ID":"d4715ed9-7cbe-4519-84fd-db690a43a69f","Type":"ContainerDied","Data":"7df8f285b57c3b4402edc15443199ba89cb81950f6462b68f2876c587a38f0ff"} Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.758633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"c43f95dc66d11078b05c05f419a10e7408fa01bb47d41b250437493d6973f74c"} Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.760941 4743 generic.go:334] "Generic (PLEG): container finished" podID="d466b91c-227b-424c-a235-7ef50de97f94" containerID="0d8927e45b70c2d03b6e2c4c1d1d644efa968af62e197c42c679207f8940c293" exitCode=0 Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.761006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f84a-account-create-9rs6f" 
event={"ID":"d466b91c-227b-424c-a235-7ef50de97f94","Type":"ContainerDied","Data":"0d8927e45b70c2d03b6e2c4c1d1d644efa968af62e197c42c679207f8940c293"} Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.763462 4743 generic.go:334] "Generic (PLEG): container finished" podID="dad1f368-dada-4a6b-8060-9aed7b85a828" containerID="c6e837874ff915e5c536412cb3da3cd182850c1c913b1c73f86a43bd12f82377" exitCode=0 Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.763538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hkb7f" event={"ID":"dad1f368-dada-4a6b-8060-9aed7b85a828","Type":"ContainerDied","Data":"c6e837874ff915e5c536412cb3da3cd182850c1c913b1c73f86a43bd12f82377"} Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.766397 4743 generic.go:334] "Generic (PLEG): container finished" podID="197ff5f2-2482-4735-8a0d-8b77ed613724" containerID="e946d2beaf2e652e9b437f9f865a3ebb63c21a0d4cc46ac8e5786ba2ea1c6994" exitCode=0 Nov 25 16:15:23 crc kubenswrapper[4743]: I1125 16:15:23.766577 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rjl8d" event={"ID":"197ff5f2-2482-4735-8a0d-8b77ed613724","Type":"ContainerDied","Data":"e946d2beaf2e652e9b437f9f865a3ebb63c21a0d4cc46ac8e5786ba2ea1c6994"} Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.159734 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6edd-account-create-4mchv" Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.168204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts\") pod \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.168301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7xrx\" (UniqueName: \"kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx\") pod \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\" (UID: \"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb\") " Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.170268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" (UID: "9f77a2c7-d59b-43a4-9ffb-4edd475a57eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.177741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx" (OuterVolumeSpecName: "kube-api-access-w7xrx") pod "9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" (UID: "9f77a2c7-d59b-43a4-9ffb-4edd475a57eb"). InnerVolumeSpecName "kube-api-access-w7xrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.203165 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-scfvn"
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.269723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts\") pod \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") "
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.270127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p68q\" (UniqueName: \"kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q\") pod \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\" (UID: \"bb815cd5-463d-4056-b4af-e9c6cf18a9c7\") "
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.270306 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb815cd5-463d-4056-b4af-e9c6cf18a9c7" (UID: "bb815cd5-463d-4056-b4af-e9c6cf18a9c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.271264 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.271367 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.271448 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7xrx\" (UniqueName: \"kubernetes.io/projected/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb-kube-api-access-w7xrx\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.273692 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q" (OuterVolumeSpecName: "kube-api-access-5p68q") pod "bb815cd5-463d-4056-b4af-e9c6cf18a9c7" (UID: "bb815cd5-463d-4056-b4af-e9c6cf18a9c7"). InnerVolumeSpecName "kube-api-access-5p68q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.373448 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p68q\" (UniqueName: \"kubernetes.io/projected/bb815cd5-463d-4056-b4af-e9c6cf18a9c7-kube-api-access-5p68q\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.780388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6edd-account-create-4mchv" event={"ID":"9f77a2c7-d59b-43a4-9ffb-4edd475a57eb","Type":"ContainerDied","Data":"6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d"}
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.780852 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b12b45329615a5c56248ee4013cb703c579854316fe197b1cb623121a37101d"
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.780613 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6edd-account-create-4mchv"
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.782557 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-scfvn"
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.782698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-scfvn" event={"ID":"bb815cd5-463d-4056-b4af-e9c6cf18a9c7","Type":"ContainerDied","Data":"62912b31a8c116bdbe13f2bedb329e9cffebf710ef5c76e22e3027781e1cb22f"}
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.782756 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62912b31a8c116bdbe13f2bedb329e9cffebf710ef5c76e22e3027781e1cb22f"
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.785423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"2bf47f48e393fe4b3e8e0dbfaa699b6ef5725e9c723895445ae377386d69fa28"}
Nov 25 16:15:24 crc kubenswrapper[4743]: I1125 16:15:24.785509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"1078504d0b933b3c35a609f1f79173e459a0e2b289dfafc0e76357f0e21b182e"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.598016 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hkb7f"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.605433 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts\") pod \"dad1f368-dada-4a6b-8060-9aed7b85a828\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.605478 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bx9p\" (UniqueName: \"kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p\") pod \"dad1f368-dada-4a6b-8060-9aed7b85a828\" (UID: \"dad1f368-dada-4a6b-8060-9aed7b85a828\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.606475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dad1f368-dada-4a6b-8060-9aed7b85a828" (UID: "dad1f368-dada-4a6b-8060-9aed7b85a828"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.607988 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f84a-account-create-9rs6f"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.617477 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p" (OuterVolumeSpecName: "kube-api-access-8bx9p") pod "dad1f368-dada-4a6b-8060-9aed7b85a828" (UID: "dad1f368-dada-4a6b-8060-9aed7b85a828"). InnerVolumeSpecName "kube-api-access-8bx9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.666864 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rjl8d"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.676701 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27cb-account-create-d8dqk"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.707058 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dad1f368-dada-4a6b-8060-9aed7b85a828-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.707089 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bx9p\" (UniqueName: \"kubernetes.io/projected/dad1f368-dada-4a6b-8060-9aed7b85a828-kube-api-access-8bx9p\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.804938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"bc3ad6c432fd347ac4401b4e98e3f3e7298043b30f6c5659c36387b0429da239"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.806572 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f84a-account-create-9rs6f" event={"ID":"d466b91c-227b-424c-a235-7ef50de97f94","Type":"ContainerDied","Data":"19b96199d2b4713f5d30e442cf9e5ee9c87c7dfb58ff51c862fb309dbb568f30"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.806627 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b96199d2b4713f5d30e442cf9e5ee9c87c7dfb58ff51c862fb309dbb568f30"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.806781 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f84a-account-create-9rs6f"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.807489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qcr5\" (UniqueName: \"kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5\") pod \"197ff5f2-2482-4735-8a0d-8b77ed613724\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.807576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts\") pod \"d4715ed9-7cbe-4519-84fd-db690a43a69f\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.807648 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts\") pod \"d466b91c-227b-424c-a235-7ef50de97f94\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.808081 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf54n\" (UniqueName: \"kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n\") pod \"d4715ed9-7cbe-4519-84fd-db690a43a69f\" (UID: \"d4715ed9-7cbe-4519-84fd-db690a43a69f\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.808663 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts\") pod \"197ff5f2-2482-4735-8a0d-8b77ed613724\" (UID: \"197ff5f2-2482-4735-8a0d-8b77ed613724\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.809127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mfdr\" (UniqueName: \"kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr\") pod \"d466b91c-227b-424c-a235-7ef50de97f94\" (UID: \"d466b91c-227b-424c-a235-7ef50de97f94\") "
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.808233 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d466b91c-227b-424c-a235-7ef50de97f94" (UID: "d466b91c-227b-424c-a235-7ef50de97f94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.808239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4715ed9-7cbe-4519-84fd-db690a43a69f" (UID: "d4715ed9-7cbe-4519-84fd-db690a43a69f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.809226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "197ff5f2-2482-4735-8a0d-8b77ed613724" (UID: "197ff5f2-2482-4735-8a0d-8b77ed613724"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.809575 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hkb7f" event={"ID":"dad1f368-dada-4a6b-8060-9aed7b85a828","Type":"ContainerDied","Data":"1821440e4c89510703328ae6ddd6ab811ea3457dd2644a3453909412b80d02f3"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.809688 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1821440e4c89510703328ae6ddd6ab811ea3457dd2644a3453909412b80d02f3"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.809780 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hkb7f"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.810324 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4715ed9-7cbe-4519-84fd-db690a43a69f-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.811153 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d466b91c-227b-424c-a235-7ef50de97f94-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.811241 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/197ff5f2-2482-4735-8a0d-8b77ed613724-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.811832 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rjl8d"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.811835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rjl8d" event={"ID":"197ff5f2-2482-4735-8a0d-8b77ed613724","Type":"ContainerDied","Data":"83c3a7abbd4317a481a70e9b74ca0aa982ee48d5c144dfb88f0001d598a71d15"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.812041 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c3a7abbd4317a481a70e9b74ca0aa982ee48d5c144dfb88f0001d598a71d15"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.813408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27cb-account-create-d8dqk" event={"ID":"d4715ed9-7cbe-4519-84fd-db690a43a69f","Type":"ContainerDied","Data":"1b1fc8680859004d3e8b5c742bd8c750abe93014db28db7cfb60477f25849a2b"}
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.813433 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b1fc8680859004d3e8b5c742bd8c750abe93014db28db7cfb60477f25849a2b"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.813487 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27cb-account-create-d8dqk"
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.823543 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5" (OuterVolumeSpecName: "kube-api-access-8qcr5") pod "197ff5f2-2482-4735-8a0d-8b77ed613724" (UID: "197ff5f2-2482-4735-8a0d-8b77ed613724"). InnerVolumeSpecName "kube-api-access-8qcr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.824722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n" (OuterVolumeSpecName: "kube-api-access-jf54n") pod "d4715ed9-7cbe-4519-84fd-db690a43a69f" (UID: "d4715ed9-7cbe-4519-84fd-db690a43a69f"). InnerVolumeSpecName "kube-api-access-jf54n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.824810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr" (OuterVolumeSpecName: "kube-api-access-7mfdr") pod "d466b91c-227b-424c-a235-7ef50de97f94" (UID: "d466b91c-227b-424c-a235-7ef50de97f94"). InnerVolumeSpecName "kube-api-access-7mfdr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.913240 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf54n\" (UniqueName: \"kubernetes.io/projected/d4715ed9-7cbe-4519-84fd-db690a43a69f-kube-api-access-jf54n\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.913380 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mfdr\" (UniqueName: \"kubernetes.io/projected/d466b91c-227b-424c-a235-7ef50de97f94-kube-api-access-7mfdr\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:26 crc kubenswrapper[4743]: I1125 16:15:26.913454 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qcr5\" (UniqueName: \"kubernetes.io/projected/197ff5f2-2482-4735-8a0d-8b77ed613724-kube-api-access-8qcr5\") on node \"crc\" DevicePath \"\""
Nov 25 16:15:27 crc kubenswrapper[4743]: I1125 16:15:27.826398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cfq9v" event={"ID":"548ce1d7-a489-4a29-9e63-56b28e48f7e1","Type":"ContainerStarted","Data":"f1be27638a5ebffe15bac10257198b404a0f5825bad193f166d7267c34625ce3"}
Nov 25 16:15:27 crc kubenswrapper[4743]: I1125 16:15:27.857339 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cfq9v" podStartSLOduration=7.55944651 podStartE2EDuration="12.857311453s" podCreationTimestamp="2025-11-25 16:15:15 +0000 UTC" firstStartedPulling="2025-11-25 16:15:21.564375278 +0000 UTC m=+1000.686214827" lastFinishedPulling="2025-11-25 16:15:26.862240221 +0000 UTC m=+1005.984079770" observedRunningTime="2025-11-25 16:15:27.850394416 +0000 UTC m=+1006.972233975" watchObservedRunningTime="2025-11-25 16:15:27.857311453 +0000 UTC m=+1006.979151012"
Nov 25 16:15:29 crc kubenswrapper[4743]: I1125 16:15:29.847959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"abd635a194f54bfce003a0dc22d170b3b3dba979ee1c6c46bbda9aee9c88102f"}
Nov 25 16:15:29 crc kubenswrapper[4743]: I1125 16:15:29.848693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"7bbb8bbd74d3888789d9dfaabbc3da0f5eaca2f158b34da73276d54b854b06d8"}
Nov 25 16:15:30 crc kubenswrapper[4743]: I1125 16:15:30.867472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"6128d5a64461ac459f4be305517f3ce13cc901a4fc730dcaae46fbf7a07e91de"}
Nov 25 16:15:30 crc kubenswrapper[4743]: I1125 16:15:30.868049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"f53542bde78f9514904a77bf05d48de2eb7ba8bd13bdf381b86eb69c831d9b9a"}
Nov 25 16:15:31 crc kubenswrapper[4743]: I1125 16:15:31.879506 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"3a596de3612bfaae91a1bfc4eccc74518f184abb98b20074795289669907e1d4"}
Nov 25 16:15:31 crc kubenswrapper[4743]: I1125 16:15:31.879560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"9092899eb730dd469fac4b34a847cb65f0c3b3c4c4b35b8aee4603be3afb5332"}
Nov 25 16:15:31 crc kubenswrapper[4743]: I1125 16:15:31.879574 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"7bdb1ec1d19dfad97232ebf25fdc6e4be31dad20ff125a3044ea3eabf8b2dec1"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.892904 4743 generic.go:334] "Generic (PLEG): container finished" podID="548ce1d7-a489-4a29-9e63-56b28e48f7e1" containerID="f1be27638a5ebffe15bac10257198b404a0f5825bad193f166d7267c34625ce3" exitCode=0
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.893403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cfq9v" event={"ID":"548ce1d7-a489-4a29-9e63-56b28e48f7e1","Type":"ContainerDied","Data":"f1be27638a5ebffe15bac10257198b404a0f5825bad193f166d7267c34625ce3"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.909578 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"9599852311ca3ae9d019eb0135492a95115df701ecde5941ed365945a47951f1"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.909655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"37bb3d4d685962107605f23fa3ae676bb0b5a2bf61d98aeabb1ad3a5a5d0ebdf"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.909676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"5a16da48a2c42fbf746d7c64bca9c818c45ddb1c6787942838f5324bcc00fad7"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.909695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9ae66928-3c05-4597-98a6-f663e9df7cff","Type":"ContainerStarted","Data":"357bd4d319460efd32126f47f5e888bc4951efed978a2fd2acd03e62aa7b6563"}
Nov 25 16:15:32 crc kubenswrapper[4743]: I1125 16:15:32.968262 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.493944779 podStartE2EDuration="51.968214277s" podCreationTimestamp="2025-11-25 16:14:41 +0000 UTC" firstStartedPulling="2025-11-25 16:15:21.664263484 +0000 UTC m=+1000.786103033" lastFinishedPulling="2025-11-25 16:15:31.138532982 +0000 UTC m=+1010.260372531" observedRunningTime="2025-11-25 16:15:32.963749427 +0000 UTC m=+1012.085589016" watchObservedRunningTime="2025-11-25 16:15:32.968214277 +0000 UTC m=+1012.090053836"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253425 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"]
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.253869 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ff5f2-2482-4735-8a0d-8b77ed613724" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253890 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ff5f2-2482-4735-8a0d-8b77ed613724" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.253907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad1f368-dada-4a6b-8060-9aed7b85a828" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253914 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad1f368-dada-4a6b-8060-9aed7b85a828" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.253943 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253951 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.253962 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4715ed9-7cbe-4519-84fd-db690a43a69f" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253970 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4715ed9-7cbe-4519-84fd-db690a43a69f" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.253987 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d466b91c-227b-424c-a235-7ef50de97f94" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.253994 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d466b91c-227b-424c-a235-7ef50de97f94" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: E1125 16:15:33.254005 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb815cd5-463d-4056-b4af-e9c6cf18a9c7" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254011 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb815cd5-463d-4056-b4af-e9c6cf18a9c7" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254220 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb815cd5-463d-4056-b4af-e9c6cf18a9c7" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254234 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d466b91c-227b-424c-a235-7ef50de97f94" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254254 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4715ed9-7cbe-4519-84fd-db690a43a69f" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254279 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" containerName="mariadb-account-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254297 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="197ff5f2-2482-4735-8a0d-8b77ed613724" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.254307 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad1f368-dada-4a6b-8060-9aed7b85a828" containerName="mariadb-database-create"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.255553 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.258843 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.263488 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"]
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgjwd\" (UniqueName: \"kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429495 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429543 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.429566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgjwd\" (UniqueName: \"kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530812 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.530889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.532051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.532100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.532125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.532286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.532679 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.550237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgjwd\" (UniqueName: \"kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd\") pod \"dnsmasq-dns-764c5664d7-sq77v\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.583087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-sq77v"
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.924783 4743 generic.go:334] "Generic (PLEG): container finished" podID="282877d5-d175-44e4-a9da-3253dd8c4d95" containerID="6e81e903379dd1bb42887a7fcb498d24ffdc267ba0f44ec87102ff49c3e5b2d1" exitCode=0
Nov 25 16:15:33 crc kubenswrapper[4743]: I1125 16:15:33.924891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gcflw" event={"ID":"282877d5-d175-44e4-a9da-3253dd8c4d95","Type":"ContainerDied","Data":"6e81e903379dd1bb42887a7fcb498d24ffdc267ba0f44ec87102ff49c3e5b2d1"}
Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.024718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"]
Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.157154 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cfq9v"
Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.340396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data\") pod \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") "
Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.340445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle\") pod \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\" (UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") "
Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.340485 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls4bc\" (UniqueName: \"kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc\") pod \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\"
(UID: \"548ce1d7-a489-4a29-9e63-56b28e48f7e1\") " Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.344457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc" (OuterVolumeSpecName: "kube-api-access-ls4bc") pod "548ce1d7-a489-4a29-9e63-56b28e48f7e1" (UID: "548ce1d7-a489-4a29-9e63-56b28e48f7e1"). InnerVolumeSpecName "kube-api-access-ls4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.363334 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "548ce1d7-a489-4a29-9e63-56b28e48f7e1" (UID: "548ce1d7-a489-4a29-9e63-56b28e48f7e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.397936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data" (OuterVolumeSpecName: "config-data") pod "548ce1d7-a489-4a29-9e63-56b28e48f7e1" (UID: "548ce1d7-a489-4a29-9e63-56b28e48f7e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.442077 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.442123 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548ce1d7-a489-4a29-9e63-56b28e48f7e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.442143 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls4bc\" (UniqueName: \"kubernetes.io/projected/548ce1d7-a489-4a29-9e63-56b28e48f7e1-kube-api-access-ls4bc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.932716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cfq9v" event={"ID":"548ce1d7-a489-4a29-9e63-56b28e48f7e1","Type":"ContainerDied","Data":"a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856"} Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.932759 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20afdce0388b8d629ff70ff70d0b6c0f3bf899cc3d6b2a81e7476ce7872c856" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.932750 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cfq9v" Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.934829 4743 generic.go:334] "Generic (PLEG): container finished" podID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerID="0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487" exitCode=0 Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.935090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" event={"ID":"9664dffc-d5be-474d-8331-ab398b2e0a7e","Type":"ContainerDied","Data":"0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487"} Nov 25 16:15:34 crc kubenswrapper[4743]: I1125 16:15:34.935165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" event={"ID":"9664dffc-d5be-474d-8331-ab398b2e0a7e","Type":"ContainerStarted","Data":"5761e0d3be9ad32900be4a03590eec37040499aaa81eb74f6344c4b9bb079975"} Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.191883 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.210863 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-56pnd"] Nov 25 16:15:35 crc kubenswrapper[4743]: E1125 16:15:35.211255 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548ce1d7-a489-4a29-9e63-56b28e48f7e1" containerName="keystone-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.211275 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="548ce1d7-a489-4a29-9e63-56b28e48f7e1" containerName="keystone-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.211477 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="548ce1d7-a489-4a29-9e63-56b28e48f7e1" containerName="keystone-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.212082 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.218813 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.219334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.225051 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.225075 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.236047 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-znjss" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.241622 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.243737 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.263721 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-56pnd"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.267690 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.353207 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.359130 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w77wg\" (UniqueName: \"kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f475\" (UniqueName: \"kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: 
\"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle\") pod 
\"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.361616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.363609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.366244 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-zpn8l" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.369274 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.369576 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.385554 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9vplz"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.386809 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.389925 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.390294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.390556 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8fqv7" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.433214 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.453276 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vplz"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.462901 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f475\" (UniqueName: \"kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.462961 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.462996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " 
pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.463059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.463082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.463110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.464616 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc 
kubenswrapper[4743]: I1125 16:15:35.468725 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468805 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rq4\" (UniqueName: \"kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 
16:15:35.468922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.468991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.469017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.469048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w77wg\" (UniqueName: \"kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.470748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.471339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.472325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.473776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.475783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.483966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts\") pod \"keystone-bootstrap-56pnd\" (UID: 
\"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.486228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.486747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.488229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.495452 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.525207 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wdbn9"] Nov 25 16:15:35 crc kubenswrapper[4743]: E1125 16:15:35.525612 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282877d5-d175-44e4-a9da-3253dd8c4d95" containerName="glance-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.525631 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="282877d5-d175-44e4-a9da-3253dd8c4d95" containerName="glance-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.525809 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="282877d5-d175-44e4-a9da-3253dd8c4d95" containerName="glance-db-sync" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.526322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.534176 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f475\" (UniqueName: \"kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475\") pod \"dnsmasq-dns-5959f8865f-6ntr5\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.541132 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.541324 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pc24g" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.541440 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577682 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rq4\" (UniqueName: \"kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.577995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.578045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.578078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmhln\" (UniqueName: 
\"kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.585184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.585723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.586411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.591108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.596944 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.598511 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.614218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rq4\" (UniqueName: \"kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4\") pod \"horizon-6b7cc8fb89-4nbgj\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.622245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w77wg\" (UniqueName: \"kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg\") pod \"keystone-bootstrap-56pnd\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.622912 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdbn9"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.629777 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mqvnl"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.631306 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.637095 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bhzgf" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.637672 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.655988 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.657007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.684197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle\") pod \"282877d5-d175-44e4-a9da-3253dd8c4d95\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.684529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data\") pod \"282877d5-d175-44e4-a9da-3253dd8c4d95\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.686708 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data\") pod \"282877d5-d175-44e4-a9da-3253dd8c4d95\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.686835 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcdhf\" (UniqueName: \"kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf\") pod \"282877d5-d175-44e4-a9da-3253dd8c4d95\" (UID: \"282877d5-d175-44e4-a9da-3253dd8c4d95\") " Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687332 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmhln\" (UniqueName: \"kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zw4\" (UniqueName: \"kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687776 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.687977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.688059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.695195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.697796 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.709571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.710573 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf" (OuterVolumeSpecName: "kube-api-access-lcdhf") pod "282877d5-d175-44e4-a9da-3253dd8c4d95" (UID: "282877d5-d175-44e4-a9da-3253dd8c4d95"). InnerVolumeSpecName "kube-api-access-lcdhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.712822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.714159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "282877d5-d175-44e4-a9da-3253dd8c4d95" (UID: "282877d5-d175-44e4-a9da-3253dd8c4d95"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.715167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.731221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.735013 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mqvnl"] Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqlf\" (UniqueName: \"kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791826 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrlp\" (UniqueName: \"kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key\") 
pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.791988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.792037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.792074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zw4\" (UniqueName: \"kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.792115 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.792126 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcdhf\" (UniqueName: \"kubernetes.io/projected/282877d5-d175-44e4-a9da-3253dd8c4d95-kube-api-access-lcdhf\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.795833 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rmhln\" (UniqueName: \"kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln\") pod \"cinder-db-sync-9vplz\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.807561 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.809784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data" (OuterVolumeSpecName: "config-data") pod "282877d5-d175-44e4-a9da-3253dd8c4d95" (UID: "282877d5-d175-44e4-a9da-3253dd8c4d95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.812238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.829341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zw4\" (UniqueName: \"kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4\") pod \"neutron-db-sync-wdbn9\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.880723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "282877d5-d175-44e4-a9da-3253dd8c4d95" (UID: "282877d5-d175-44e4-a9da-3253dd8c4d95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.881398 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.898788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.898856 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.898889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrlp\" (UniqueName: \"kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.898934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.898983 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.899008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.899051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.899094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqlf\" (UniqueName: \"kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.899154 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.899170 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282877d5-d175-44e4-a9da-3253dd8c4d95-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:35 crc 
kubenswrapper[4743]: I1125 16:15:35.902828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.903635 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.904137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.906625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.907494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.909952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.932024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trqlf\" (UniqueName: \"kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf\") pod \"horizon-7f686449-cqrx5\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.954847 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrlp\" (UniqueName: \"kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp\") pod \"barbican-db-sync-mqvnl\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:35 crc kubenswrapper[4743]: I1125 16:15:35.979171 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.003064 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.006332 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gcflw" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.006720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gcflw" event={"ID":"282877d5-d175-44e4-a9da-3253dd8c4d95","Type":"ContainerDied","Data":"e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf"} Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.007056 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45aef583fc72255960d987f75906ee47f4d85280f59ef96b451cc14545751bf" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.007167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.010155 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.012331 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.012496 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.021939 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vplz" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.025667 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.044165 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.045501 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.057062 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.066036 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2lqmd"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.067322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.070897 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.071027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.071333 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7zrbn" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.080011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" event={"ID":"9664dffc-d5be-474d-8331-ab398b2e0a7e","Type":"ContainerStarted","Data":"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9"} Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.080161 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="dnsmasq-dns" containerID="cri-o://c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9" gracePeriod=10 Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.080417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.086978 4743 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2lqmd"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkzj\" (UniqueName: \"kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 
16:15:36.102440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.102478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.118865 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.128070 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" podStartSLOduration=3.128049244 podStartE2EDuration="3.128049244s" podCreationTimestamp="2025-11-25 16:15:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:36.105217728 +0000 UTC m=+1015.227057297" watchObservedRunningTime="2025-11-25 16:15:36.128049244 +0000 UTC m=+1015.249888793" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.141885 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2f2s\" (UniqueName: \"kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 
25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.208979 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209123 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz68g\" (UniqueName: \"kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 
crc kubenswrapper[4743]: I1125 16:15:36.209348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209447 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkzj\" (UniqueName: \"kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.209731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.210087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.220190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc 
kubenswrapper[4743]: I1125 16:15:36.222707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.223855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.238228 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkzj\" (UniqueName: \"kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.243317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data\") pod \"ceilometer-0\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz68g\" (UniqueName: \"kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311508 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311610 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2f2s\" (UniqueName: \"kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle\") pod \"placement-db-sync-2lqmd\" (UID: 
\"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.311743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.312568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.313112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.314450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.315018 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.315643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.316257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.320941 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.324990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.334478 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.336718 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2f2s\" (UniqueName: \"kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s\") pod \"placement-db-sync-2lqmd\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.344682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz68g\" (UniqueName: \"kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g\") pod \"dnsmasq-dns-58dd9ff6bc-hjldx\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.345063 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.359953 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.360667 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.382376 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.384171 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.391874 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.414924 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2lqmd" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.442931 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-56pnd"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.481680 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.502131 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519541 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtbpc\" (UniqueName: 
\"kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.519817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.625880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.625945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.625972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.625994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtbpc\" (UniqueName: \"kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.626049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.626094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.628522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.632717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.632863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.634732 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.634788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.665386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtbpc\" (UniqueName: \"kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc\") pod \"dnsmasq-dns-785d8bcb8c-cg5jj\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.715386 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.746939 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdbn9"] Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.879413 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" Nov 25 16:15:36 crc kubenswrapper[4743]: I1125 16:15:36.984342 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mqvnl"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.009536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9vplz"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.032732 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.032771 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.032802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgjwd\" (UniqueName: \"kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.032946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.033002 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.033040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb\") pod \"9664dffc-d5be-474d-8331-ab398b2e0a7e\" (UID: \"9664dffc-d5be-474d-8331-ab398b2e0a7e\") " Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.042012 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd" (OuterVolumeSpecName: "kube-api-access-qgjwd") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "kube-api-access-qgjwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.092005 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.096330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" event={"ID":"2d7a2c56-856c-4950-9b24-c516aeb7158d","Type":"ContainerStarted","Data":"994c826734b1d9d0cfaaa5789a4539998924dd0ef87ddcc5f20023e27bfa649b"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.098703 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.101695 4743 generic.go:334] "Generic (PLEG): container finished" podID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerID="c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9" exitCode=0 Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.101772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" event={"ID":"9664dffc-d5be-474d-8331-ab398b2e0a7e","Type":"ContainerDied","Data":"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.108746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" event={"ID":"9664dffc-d5be-474d-8331-ab398b2e0a7e","Type":"ContainerDied","Data":"5761e0d3be9ad32900be4a03590eec37040499aaa81eb74f6344c4b9bb079975"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.108791 4743 scope.go:117] "RemoveContainer" containerID="c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.103416 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-sq77v" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.113475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config" (OuterVolumeSpecName: "config") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.119664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.122995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqvnl" event={"ID":"92e09359-debb-49f3-8490-c18e8ca5f63e","Type":"ContainerStarted","Data":"4ad6177fdf25772bd0b04add034d299083beb0a5fa849328fb0a329df93fad5e"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.124461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vplz" event={"ID":"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba","Type":"ContainerStarted","Data":"c87884a35cbd45f2a3b6895e6f2ac0dd58380afebe645319a92d2777ad61a61a"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7cc8fb89-4nbgj" event={"ID":"c0eccb42-7531-45d7-9b61-670d338bf6c1","Type":"ContainerStarted","Data":"1441ccc229b6a7f34ad8f26634bde15e724b6e9df90a3f4a32dd13c9ea5d8c45"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137564 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137613 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137628 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137655 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.137668 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgjwd\" (UniqueName: \"kubernetes.io/projected/9664dffc-d5be-474d-8331-ab398b2e0a7e-kube-api-access-qgjwd\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.146431 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9664dffc-d5be-474d-8331-ab398b2e0a7e" (UID: "9664dffc-d5be-474d-8331-ab398b2e0a7e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.157445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-56pnd" event={"ID":"abc6c25e-56eb-438a-b332-fc7f730c33d3","Type":"ContainerStarted","Data":"084bc4770ca685ecb9cdd01ea1cf180cfbcff2c8b6f47a8ed323ee1daed401fe"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.180541 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.184195 4743 scope.go:117] "RemoveContainer" containerID="0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.194103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdbn9" event={"ID":"24abed0a-5ed2-486b-ace3-d1b07ee69e5f","Type":"ContainerStarted","Data":"db9a6585ff487984a1f73748b8343b4c6805ea01bcff9da6c821b032f2afa1a9"} Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.218009 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.224276 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-56pnd" podStartSLOduration=2.224255261 podStartE2EDuration="2.224255261s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:37.176220503 +0000 UTC m=+1016.298060052" watchObservedRunningTime="2025-11-25 16:15:37.224255261 +0000 UTC m=+1016.346094810" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.245666 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9664dffc-d5be-474d-8331-ab398b2e0a7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.247570 4743 scope.go:117] "RemoveContainer" containerID="c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9" Nov 25 16:15:37 crc kubenswrapper[4743]: E1125 16:15:37.250634 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9\": container with ID starting with c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9 not found: ID does not exist" containerID="c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.250687 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9"} err="failed to get container status \"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9\": rpc error: code = NotFound desc = could not find container \"c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9\": container with ID starting with c1e9a2a53abfef2dd5b4d34288db70eeed02543c6a01c76e5faa7d5b38d303e9 not found: ID does not exist" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.250717 4743 scope.go:117] "RemoveContainer" containerID="0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487" Nov 25 16:15:37 crc kubenswrapper[4743]: E1125 16:15:37.252349 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487\": container with ID starting with 0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487 not found: ID does not exist" containerID="0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.252414 4743 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487"} err="failed to get container status \"0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487\": rpc error: code = NotFound desc = could not find container \"0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487\": container with ID starting with 0ff26c1c8a5050da465592c5d15007f1a6e49f4c282c48f6ed3ccf91253e9487 not found: ID does not exist" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.269197 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.276035 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: E1125 16:15:37.276573 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="init" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.276712 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="init" Nov 25 16:15:37 crc kubenswrapper[4743]: E1125 16:15:37.276749 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="dnsmasq-dns" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.276756 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="dnsmasq-dns" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.277006 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" containerName="dnsmasq-dns" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.278360 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.281174 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bdp5k" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.281332 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.282262 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.291673 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.312323 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2lqmd"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.374495 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:15:37 crc kubenswrapper[4743]: W1125 16:15:37.379848 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da25308_0a6b_49be_b9af_c010f9a1945d.slice/crio-3c77e6dc3182c6bf9d282ced5a542a003d4ef9dc4b2429da04df5322e56a3ed1 WatchSource:0}: Error finding container 3c77e6dc3182c6bf9d282ced5a542a003d4ef9dc4b2429da04df5322e56a3ed1: Status 404 returned error can't find the container with id 3c77e6dc3182c6bf9d282ced5a542a003d4ef9dc4b2429da04df5322e56a3ed1 Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449585 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449681 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449703 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6gc\" (UniqueName: \"kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs\") pod \"glance-default-external-api-0\" (UID: 
\"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.449817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.460203 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.465809 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-sq77v"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.545397 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.549891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551061 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 
16:15:37.551286 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.551303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6gc\" (UniqueName: \"kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.552684 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.553800 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.557140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.560121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.577720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.582735 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.600206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.608321 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftdb\" (UniqueName: \"kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 
crc kubenswrapper[4743]: I1125 16:15:37.659128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.659470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 
crc kubenswrapper[4743]: I1125 16:15:37.676177 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.677219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.710694 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.722225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6gc\" (UniqueName: \"kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.723662 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:37 crc kubenswrapper[4743]: E1125 16:15:37.724339 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="a309ab52-3ea3-418d-8f6e-2370bad135d9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.750518 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.752435 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.756547 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.760622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763819 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftdb\" (UniqueName: \"kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.763941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.764482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.765422 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.771888 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.802399 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9664dffc-d5be-474d-8331-ab398b2e0a7e" path="/var/lib/kubelet/pods/9664dffc-d5be-474d-8331-ab398b2e0a7e/volumes" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.803859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.806609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.807467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.818207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftdb\" (UniqueName: 
\"kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.865665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.865803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.865901 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.865923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.865945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdp9z\" (UniqueName: 
\"kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.881724 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdp9z\" (UniqueName: \"kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968312 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data\") pod \"horizon-84d446b49-dttj9\" (UID: 
\"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.968898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.969757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.970701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.994860 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdp9z\" (UniqueName: \"kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:37 crc kubenswrapper[4743]: I1125 16:15:37.995047 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key\") pod \"horizon-84d446b49-dttj9\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.056338 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.059854 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.215945 4743 generic.go:334] "Generic (PLEG): container finished" podID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerID="97af644eeffd8f3c88600642ff8cd8513feb6479e102525f5042d02208bfdbf9" exitCode=0 Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.216108 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" event={"ID":"3da25308-0a6b-49be-b9af-c010f9a1945d","Type":"ContainerDied","Data":"97af644eeffd8f3c88600642ff8cd8513feb6479e102525f5042d02208bfdbf9"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.216144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" event={"ID":"3da25308-0a6b-49be-b9af-c010f9a1945d","Type":"ContainerStarted","Data":"3c77e6dc3182c6bf9d282ced5a542a003d4ef9dc4b2429da04df5322e56a3ed1"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.225186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f686449-cqrx5" event={"ID":"dd2b63ac-91fc-49e7-8d5e-05a04879baba","Type":"ContainerStarted","Data":"9b72acd192156704f3a9226c59c34d6731dde1eeca0cc51267542649c301991a"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.244889 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="2d7a2c56-856c-4950-9b24-c516aeb7158d" containerID="61dca0d84003704c9c7899b2a7b6b84d3852d755d1ec6332e7175c5406b24b0c" exitCode=0 Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.244889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" event={"ID":"2d7a2c56-856c-4950-9b24-c516aeb7158d","Type":"ContainerDied","Data":"61dca0d84003704c9c7899b2a7b6b84d3852d755d1ec6332e7175c5406b24b0c"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.250391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2lqmd" event={"ID":"5a037faa-f5b9-4abb-8132-2750befdf031","Type":"ContainerStarted","Data":"0278e84e216431654f027622aca6ab372c9a0a02b8e29a33297c26ec29c591c1"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.315914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdbn9" event={"ID":"24abed0a-5ed2-486b-ace3-d1b07ee69e5f","Type":"ContainerStarted","Data":"eb73eed19eaf8f467c1547d00948479da0a5657dc4d8bfea99addae5c9fcd4e0"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.320381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-56pnd" event={"ID":"abc6c25e-56eb-438a-b332-fc7f730c33d3","Type":"ContainerStarted","Data":"4e14ec9661e427d9de11a7735eb9aa7c07fc4a0622749c8d48ae6dcc47377e33"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.355126 4743 generic.go:334] "Generic (PLEG): container finished" podID="39b7ff58-2347-4d8d-a544-c960ebcf6cc0" containerID="9905d997af7dfaea959a30cafe582dcb719eb022c06a0907c00668faf0a7fbca" exitCode=0 Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.355282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" event={"ID":"39b7ff58-2347-4d8d-a544-c960ebcf6cc0","Type":"ContainerDied","Data":"9905d997af7dfaea959a30cafe582dcb719eb022c06a0907c00668faf0a7fbca"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 
16:15:38.355332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" event={"ID":"39b7ff58-2347-4d8d-a544-c960ebcf6cc0","Type":"ContainerStarted","Data":"d7710949172267ae5788a3bc10dfb10ccdb6083544d169deb40aef030c2f3a33"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.360864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerStarted","Data":"2ccb2949efe37ffa7e6915b738b1ebfc9f2c26faaef5b9912f80767aaf9cb39a"} Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.360990 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.373884 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wdbn9" podStartSLOduration=3.373866575 podStartE2EDuration="3.373866575s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:38.338899107 +0000 UTC m=+1017.460738656" watchObservedRunningTime="2025-11-25 16:15:38.373866575 +0000 UTC m=+1017.495706124" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.384922 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz6gc\" (UniqueName: \"kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498137 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.498372 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data\") pod \"a309ab52-3ea3-418d-8f6e-2370bad135d9\" (UID: \"a309ab52-3ea3-418d-8f6e-2370bad135d9\") " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.500333 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.500774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs" (OuterVolumeSpecName: "logs") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.505548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc" (OuterVolumeSpecName: "kube-api-access-qz6gc") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "kube-api-access-qz6gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.505956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.507279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts" (OuterVolumeSpecName: "scripts") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.507267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data" (OuterVolumeSpecName: "config-data") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.508421 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a309ab52-3ea3-418d-8f6e-2370bad135d9" (UID: "a309ab52-3ea3-418d-8f6e-2370bad135d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.600302 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz6gc\" (UniqueName: \"kubernetes.io/projected/a309ab52-3ea3-418d-8f6e-2370bad135d9-kube-api-access-qz6gc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601489 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601502 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601511 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601521 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a309ab52-3ea3-418d-8f6e-2370bad135d9-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601529 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.601538 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a309ab52-3ea3-418d-8f6e-2370bad135d9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.640918 4743 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.703164 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.902932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.956471 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:38 crc kubenswrapper[4743]: I1125 16:15:38.967961 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.114639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f475\" (UniqueName: \"kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb\") pod \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config\") pod \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 
16:15:39.115126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz68g\" (UniqueName: \"kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g\") pod \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115394 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0\") pod 
\"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0\") pod \"2d7a2c56-856c-4950-9b24-c516aeb7158d\" (UID: \"2d7a2c56-856c-4950-9b24-c516aeb7158d\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb\") pod \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.115514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc\") pod \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\" (UID: \"39b7ff58-2347-4d8d-a544-c960ebcf6cc0\") " Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.134290 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g" (OuterVolumeSpecName: "kube-api-access-bz68g") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "kube-api-access-bz68g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.142425 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.145514 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475" (OuterVolumeSpecName: "kube-api-access-7f475") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "kube-api-access-7f475". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.150728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config" (OuterVolumeSpecName: "config") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.154399 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.157498 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.165709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.168884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.177002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.182006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.184652 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config" (OuterVolumeSpecName: "config") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.196371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39b7ff58-2347-4d8d-a544-c960ebcf6cc0" (UID: "39b7ff58-2347-4d8d-a544-c960ebcf6cc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.203450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d7a2c56-856c-4950-9b24-c516aeb7158d" (UID: "2d7a2c56-856c-4950-9b24-c516aeb7158d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217864 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217904 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217918 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217927 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217941 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz68g\" (UniqueName: \"kubernetes.io/projected/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-kube-api-access-bz68g\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217950 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217961 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217972 4743 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d7a2c56-856c-4950-9b24-c516aeb7158d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217982 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.217991 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.218001 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f475\" (UniqueName: \"kubernetes.io/projected/2d7a2c56-856c-4950-9b24-c516aeb7158d-kube-api-access-7f475\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.218011 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39b7ff58-2347-4d8d-a544-c960ebcf6cc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.386433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" event={"ID":"39b7ff58-2347-4d8d-a544-c960ebcf6cc0","Type":"ContainerDied","Data":"d7710949172267ae5788a3bc10dfb10ccdb6083544d169deb40aef030c2f3a33"} Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.386490 4743 scope.go:117] "RemoveContainer" containerID="9905d997af7dfaea959a30cafe582dcb719eb022c06a0907c00668faf0a7fbca" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.386655 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hjldx" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.402879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" event={"ID":"3da25308-0a6b-49be-b9af-c010f9a1945d","Type":"ContainerStarted","Data":"4c7d7730b31c5b26b34a28f8bdd4da286b0ede847d9f63cd84b135dc5302dd8e"} Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.403039 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.408552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" event={"ID":"2d7a2c56-856c-4950-9b24-c516aeb7158d","Type":"ContainerDied","Data":"994c826734b1d9d0cfaaa5789a4539998924dd0ef87ddcc5f20023e27bfa649b"} Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.408650 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-6ntr5" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.415311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84d446b49-dttj9" event={"ID":"2f65e718-4161-401d-88fd-50c615ce803b","Type":"ContainerStarted","Data":"eef7a2bf495de7c23b0047f8d720e7ada75ff73c6bb565ee7d0505da5eb0f043"} Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.425754 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.425901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerStarted","Data":"0ea61c58739f5191ad68be584e524fe0e9c3a67ce195ac847988cff6b48a6d66"} Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.459840 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.471822 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hjldx"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.476855 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" podStartSLOduration=3.476833164 podStartE2EDuration="3.476833164s" podCreationTimestamp="2025-11-25 16:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:39.454153032 +0000 UTC m=+1018.575992581" watchObservedRunningTime="2025-11-25 16:15:39.476833164 +0000 UTC m=+1018.598672713" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.504764 4743 scope.go:117] "RemoveContainer" containerID="61dca0d84003704c9c7899b2a7b6b84d3852d755d1ec6332e7175c5406b24b0c" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.522669 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.535111 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-6ntr5"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.548654 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:39 crc kubenswrapper[4743]: 
I1125 16:15:39.593152 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.624089 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:39 crc kubenswrapper[4743]: E1125 16:15:39.624467 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b7ff58-2347-4d8d-a544-c960ebcf6cc0" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.624479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b7ff58-2347-4d8d-a544-c960ebcf6cc0" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: E1125 16:15:39.624544 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7a2c56-856c-4950-9b24-c516aeb7158d" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.624552 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7a2c56-856c-4950-9b24-c516aeb7158d" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.624791 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7a2c56-856c-4950-9b24-c516aeb7158d" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.624802 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b7ff58-2347-4d8d-a544-c960ebcf6cc0" containerName="init" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.629791 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.633546 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.659242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmtz\" (UniqueName: \"kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.772825 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.822204 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7a2c56-856c-4950-9b24-c516aeb7158d" path="/var/lib/kubelet/pods/2d7a2c56-856c-4950-9b24-c516aeb7158d/volumes" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.822724 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b7ff58-2347-4d8d-a544-c960ebcf6cc0" path="/var/lib/kubelet/pods/39b7ff58-2347-4d8d-a544-c960ebcf6cc0/volumes" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.823300 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a309ab52-3ea3-418d-8f6e-2370bad135d9" 
path="/var/lib/kubelet/pods/a309ab52-3ea3-418d-8f6e-2370bad135d9/volumes" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.874040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.874580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.874822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.874985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.875153 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc 
kubenswrapper[4743]: I1125 16:15:39.875212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.875272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.875301 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmtz\" (UniqueName: \"kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.875560 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.875823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.880752 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.884868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.899286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmtz\" (UniqueName: \"kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.908377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.922716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " pod="openstack/glance-default-external-api-0" Nov 25 16:15:39 crc kubenswrapper[4743]: I1125 16:15:39.985670 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:15:40 crc kubenswrapper[4743]: I1125 16:15:40.442763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerStarted","Data":"d999381c5b7ec9406b6f8bf5a3ac48472f5e9ee7112e1b2edfb2ab3164b57425"} Nov 25 16:15:40 crc kubenswrapper[4743]: I1125 16:15:40.532308 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.471196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerStarted","Data":"9b4b6267173eb5e87431cb9bfa2b29adcf4cfd13bcb37c35bb732862b8c0f5e8"} Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.471294 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-log" containerID="cri-o://d999381c5b7ec9406b6f8bf5a3ac48472f5e9ee7112e1b2edfb2ab3164b57425" gracePeriod=30 Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.471335 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-httpd" containerID="cri-o://9b4b6267173eb5e87431cb9bfa2b29adcf4cfd13bcb37c35bb732862b8c0f5e8" gracePeriod=30 Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.476371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerStarted","Data":"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5"} Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.476552 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerStarted","Data":"0e935af4ff25bf7e2b902218ffbf22b4297a84b3e1e3c86b9bd6cf70b1484b5b"} Nov 25 16:15:41 crc kubenswrapper[4743]: I1125 16:15:41.499666 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.499640463 podStartE2EDuration="5.499640463s" podCreationTimestamp="2025-11-25 16:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:15:41.489121423 +0000 UTC m=+1020.610960972" watchObservedRunningTime="2025-11-25 16:15:41.499640463 +0000 UTC m=+1020.621480012" Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.489439 4743 generic.go:334] "Generic (PLEG): container finished" podID="abc6c25e-56eb-438a-b332-fc7f730c33d3" containerID="4e14ec9661e427d9de11a7735eb9aa7c07fc4a0622749c8d48ae6dcc47377e33" exitCode=0 Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.489521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-56pnd" event={"ID":"abc6c25e-56eb-438a-b332-fc7f730c33d3","Type":"ContainerDied","Data":"4e14ec9661e427d9de11a7735eb9aa7c07fc4a0622749c8d48ae6dcc47377e33"} Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.495247 4743 generic.go:334] "Generic (PLEG): container finished" podID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerID="9b4b6267173eb5e87431cb9bfa2b29adcf4cfd13bcb37c35bb732862b8c0f5e8" exitCode=0 Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.495303 4743 generic.go:334] "Generic (PLEG): container finished" podID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerID="d999381c5b7ec9406b6f8bf5a3ac48472f5e9ee7112e1b2edfb2ab3164b57425" exitCode=143 Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.495332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerDied","Data":"9b4b6267173eb5e87431cb9bfa2b29adcf4cfd13bcb37c35bb732862b8c0f5e8"} Nov 25 16:15:42 crc kubenswrapper[4743]: I1125 16:15:42.495383 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerDied","Data":"d999381c5b7ec9406b6f8bf5a3ac48472f5e9ee7112e1b2edfb2ab3164b57425"} Nov 25 16:15:46 crc kubenswrapper[4743]: I1125 16:15:46.719537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:15:46 crc kubenswrapper[4743]: I1125 16:15:46.782195 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:15:46 crc kubenswrapper[4743]: I1125 16:15:46.782474 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" containerID="cri-o://5209329b204f2ec2f717266243f9212ac731a17bd362f2758a97567e6a836f26" gracePeriod=10 Nov 25 16:15:47 crc kubenswrapper[4743]: I1125 16:15:47.285185 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Nov 25 16:15:47 crc kubenswrapper[4743]: I1125 16:15:47.545434 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerID="5209329b204f2ec2f717266243f9212ac731a17bd362f2758a97567e6a836f26" exitCode=0 Nov 25 16:15:47 crc kubenswrapper[4743]: I1125 16:15:47.545481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-q4m8b" 
event={"ID":"ad5e0061-9982-4b48-b8f3-877ecc734668","Type":"ContainerDied","Data":"5209329b204f2ec2f717266243f9212ac731a17bd362f2758a97567e6a836f26"} Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.457440 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.797817 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.830074 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.831836 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.836993 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.841454 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.900377 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.914891 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7495cddcb-ghpkx"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.918289 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.925385 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7495cddcb-ghpkx"] Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7527\" (UniqueName: \"kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972881 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:48 crc kubenswrapper[4743]: I1125 16:15:48.972923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7527\" (UniqueName: \"kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074584 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-secret-key\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54ceb1-969a-4172-9928-7e424dd38b5b-logs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-config-data\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-combined-ca-bundle\") pod 
\"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdcc6\" (UniqueName: \"kubernetes.io/projected/1e54ceb1-969a-4172-9928-7e424dd38b5b-kube-api-access-tdcc6\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.074992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-tls-certs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.075018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-scripts\") pod \"horizon-7495cddcb-ghpkx\" (UID: 
\"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.075049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.075753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.075929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.076990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.081583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.081725 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.082246 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.098136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7527\" (UniqueName: \"kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527\") pod \"horizon-66f797f6cb-zd4ck\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.172406 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.176580 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-config-data\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.176733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-combined-ca-bundle\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.176825 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdcc6\" (UniqueName: \"kubernetes.io/projected/1e54ceb1-969a-4172-9928-7e424dd38b5b-kube-api-access-tdcc6\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.176972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-tls-certs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.177065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-scripts\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc 
kubenswrapper[4743]: I1125 16:15:49.177163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-secret-key\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.177261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54ceb1-969a-4172-9928-7e424dd38b5b-logs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.177545 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54ceb1-969a-4172-9928-7e424dd38b5b-logs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.177763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-scripts\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.178109 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e54ceb1-969a-4172-9928-7e424dd38b5b-config-data\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.180184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-combined-ca-bundle\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.180645 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-tls-certs\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.180799 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e54ceb1-969a-4172-9928-7e424dd38b5b-horizon-secret-key\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.195273 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdcc6\" (UniqueName: \"kubernetes.io/projected/1e54ceb1-969a-4172-9928-7e424dd38b5b-kube-api-access-tdcc6\") pod \"horizon-7495cddcb-ghpkx\" (UID: \"1e54ceb1-969a-4172-9928-7e424dd38b5b\") " pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:49 crc kubenswrapper[4743]: I1125 16:15:49.243320 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:15:52 crc kubenswrapper[4743]: I1125 16:15:52.284508 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.385194 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftdb\" (UniqueName: \"kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553490 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553731 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.553925 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run\") pod \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\" (UID: \"97694e36-d7dd-42e8-85bc-3aa41ccb6364\") " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.554679 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.554891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs" (OuterVolumeSpecName: "logs") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.555272 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.555297 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97694e36-d7dd-42e8-85bc-3aa41ccb6364-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.560746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb" (OuterVolumeSpecName: "kube-api-access-kftdb") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "kube-api-access-kftdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.564732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts" (OuterVolumeSpecName: "scripts") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.566141 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.595447 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.608049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97694e36-d7dd-42e8-85bc-3aa41ccb6364","Type":"ContainerDied","Data":"0ea61c58739f5191ad68be584e524fe0e9c3a67ce195ac847988cff6b48a6d66"} Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.608113 4743 scope.go:117] "RemoveContainer" containerID="9b4b6267173eb5e87431cb9bfa2b29adcf4cfd13bcb37c35bb732862b8c0f5e8" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.608299 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.626534 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data" (OuterVolumeSpecName: "config-data") pod "97694e36-d7dd-42e8-85bc-3aa41ccb6364" (UID: "97694e36-d7dd-42e8-85bc-3aa41ccb6364"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.657212 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kftdb\" (UniqueName: \"kubernetes.io/projected/97694e36-d7dd-42e8-85bc-3aa41ccb6364-kube-api-access-kftdb\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.657277 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.657429 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.657439 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.657452 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97694e36-d7dd-42e8-85bc-3aa41ccb6364-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.683107 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 16:15:53 crc kubenswrapper[4743]: E1125 16:15:53.757381 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 25 16:15:53 crc kubenswrapper[4743]: E1125 16:15:53.758736 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbbh658h5dhc8h645h668h89h64bh5d4h5bfh5d4hd9h54dhfh679h5ch87h5chd5hcch68h86hd9h7dh559hfch76hffh87h6ch77h65fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdp9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-84d446b49-dttj9_openstack(2f65e718-4161-401d-88fd-50c615ce803b): ErrImagePull: rpc error: 
code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.759835 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:15:53 crc kubenswrapper[4743]: E1125 16:15:53.798761 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-84d446b49-dttj9" podUID="2f65e718-4161-401d-88fd-50c615ce803b" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.944764 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.956050 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.969793 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:53 crc kubenswrapper[4743]: E1125 16:15:53.970555 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-httpd" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.970583 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-httpd" Nov 25 16:15:53 crc kubenswrapper[4743]: E1125 16:15:53.970637 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-log" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.970650 
4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-log" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.970930 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-httpd" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.970969 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" containerName="glance-log" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.972788 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.976689 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.976688 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 16:15:53 crc kubenswrapper[4743]: I1125 16:15:53.980146 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.073935 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " 
pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwhg\" (UniqueName: \"kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.074982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.075152 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.177631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.178414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwhg\" (UniqueName: \"kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.178562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.178743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.178858 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.179024 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.181177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.181224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.181810 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 
16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.183657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.183818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.183852 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.183888 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.185063 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.185165 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.198750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwhg\" (UniqueName: \"kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.221353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:15:54 crc kubenswrapper[4743]: I1125 16:15:54.308210 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:15:55 crc kubenswrapper[4743]: I1125 16:15:55.786675 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97694e36-d7dd-42e8-85bc-3aa41ccb6364" path="/var/lib/kubelet/pods/97694e36-d7dd-42e8-85bc-3aa41ccb6364/volumes" Nov 25 16:15:57 crc kubenswrapper[4743]: I1125 16:15:57.284914 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Nov 25 16:15:57 crc kubenswrapper[4743]: I1125 16:15:57.285076 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.284489 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-q4m8b" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.515124 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.641146 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.641289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.641313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.642412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w77wg\" (UniqueName: \"kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.642488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.642527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts\") pod \"abc6c25e-56eb-438a-b332-fc7f730c33d3\" (UID: \"abc6c25e-56eb-438a-b332-fc7f730c33d3\") " Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.647508 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.647528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.647779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg" (OuterVolumeSpecName: "kube-api-access-w77wg") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "kube-api-access-w77wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.647914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts" (OuterVolumeSpecName: "scripts") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.670784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data" (OuterVolumeSpecName: "config-data") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.680816 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abc6c25e-56eb-438a-b332-fc7f730c33d3" (UID: "abc6c25e-56eb-438a-b332-fc7f730c33d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.689217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-56pnd" event={"ID":"abc6c25e-56eb-438a-b332-fc7f730c33d3","Type":"ContainerDied","Data":"084bc4770ca685ecb9cdd01ea1cf180cfbcff2c8b6f47a8ed323ee1daed401fe"} Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.689263 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="084bc4770ca685ecb9cdd01ea1cf180cfbcff2c8b6f47a8ed323ee1daed401fe" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.689263 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-56pnd" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744577 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w77wg\" (UniqueName: \"kubernetes.io/projected/abc6c25e-56eb-438a-b332-fc7f730c33d3-kube-api-access-w77wg\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744633 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744645 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744653 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744664 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:02 crc kubenswrapper[4743]: I1125 16:16:02.744672 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abc6c25e-56eb-438a-b332-fc7f730c33d3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.594676 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-56pnd"] Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.601442 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-56pnd"] Nov 25 16:16:03 crc 
kubenswrapper[4743]: I1125 16:16:03.699109 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qx98x"] Nov 25 16:16:03 crc kubenswrapper[4743]: E1125 16:16:03.699580 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc6c25e-56eb-438a-b332-fc7f730c33d3" containerName="keystone-bootstrap" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.699605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc6c25e-56eb-438a-b332-fc7f730c33d3" containerName="keystone-bootstrap" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.699772 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc6c25e-56eb-438a-b332-fc7f730c33d3" containerName="keystone-bootstrap" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.702676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.707241 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.707444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-znjss" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.707668 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.709818 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.709827 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.720464 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qx98x"] Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.784998 4743 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="abc6c25e-56eb-438a-b332-fc7f730c33d3" path="/var/lib/kubelet/pods/abc6c25e-56eb-438a-b332-fc7f730c33d3/volumes" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89k8l\" (UniqueName: 
\"kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.865985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.967935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.967997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.968028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.968095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys\") pod 
\"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.968114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89k8l\" (UniqueName: \"kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.968189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.973000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.973548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.973572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " 
pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.973973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.974144 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:03 crc kubenswrapper[4743]: I1125 16:16:03.985486 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89k8l\" (UniqueName: \"kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l\") pod \"keystone-bootstrap-qx98x\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:04 crc kubenswrapper[4743]: I1125 16:16:04.020331 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.063676 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.064084 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n546h569hc4h646h657hf7h598h5d4hfdh549h5b9h659h668h545hfhc6h5c4h87hf6h5c4h547h677h667h56bh547h668h5fdhd9h567h689h685h56fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8rq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*tr
ue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b7cc8fb89-4nbgj_openstack(c0eccb42-7531-45d7-9b61-670d338bf6c1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.067029 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6b7cc8fb89-4nbgj" podUID="c0eccb42-7531-45d7-9b61-670d338bf6c1" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.080695 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.080966 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64dhb7h68dh594hf6h5cdh5d4h59h5fhbch589h66bhbdh557hbh5dfh5f7h7fh564hf6h88h85h78h8dh55bh84h5bch64fh599h66bh4h95q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trqlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f686449-cqrx5_openstack(dd2b63ac-91fc-49e7-8d5e-05a04879baba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:16:04 crc kubenswrapper[4743]: E1125 16:16:04.083131 
4743 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7f686449-cqrx5" podUID="dd2b63ac-91fc-49e7-8d5e-05a04879baba" Nov 25 16:16:05 crc kubenswrapper[4743]: E1125 16:16:05.292711 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 25 16:16:05 crc kubenswrapper[4743]: E1125 16:16:05.293199 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2f2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-2lqmd_openstack(5a037faa-f5b9-4abb-8132-2750befdf031): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:16:05 crc kubenswrapper[4743]: E1125 16:16:05.294312 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2lqmd" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.386528 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.494689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data\") pod \"2f65e718-4161-401d-88fd-50c615ce803b\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.494750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts\") pod \"2f65e718-4161-401d-88fd-50c615ce803b\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts" (OuterVolumeSpecName: "scripts") pod "2f65e718-4161-401d-88fd-50c615ce803b" (UID: "2f65e718-4161-401d-88fd-50c615ce803b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs\") pod \"2f65e718-4161-401d-88fd-50c615ce803b\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495868 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdp9z\" (UniqueName: \"kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z\") pod \"2f65e718-4161-401d-88fd-50c615ce803b\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495931 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key\") pod \"2f65e718-4161-401d-88fd-50c615ce803b\" (UID: \"2f65e718-4161-401d-88fd-50c615ce803b\") " Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495650 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data" (OuterVolumeSpecName: "config-data") pod "2f65e718-4161-401d-88fd-50c615ce803b" (UID: "2f65e718-4161-401d-88fd-50c615ce803b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.495905 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs" (OuterVolumeSpecName: "logs") pod "2f65e718-4161-401d-88fd-50c615ce803b" (UID: "2f65e718-4161-401d-88fd-50c615ce803b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.496761 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.496791 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f65e718-4161-401d-88fd-50c615ce803b-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.496801 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f65e718-4161-401d-88fd-50c615ce803b-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.502763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z" (OuterVolumeSpecName: "kube-api-access-kdp9z") pod "2f65e718-4161-401d-88fd-50c615ce803b" (UID: "2f65e718-4161-401d-88fd-50c615ce803b"). InnerVolumeSpecName "kube-api-access-kdp9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.503109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2f65e718-4161-401d-88fd-50c615ce803b" (UID: "2f65e718-4161-401d-88fd-50c615ce803b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.599099 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdp9z\" (UniqueName: \"kubernetes.io/projected/2f65e718-4161-401d-88fd-50c615ce803b-kube-api-access-kdp9z\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.599137 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f65e718-4161-401d-88fd-50c615ce803b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.728380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84d446b49-dttj9" event={"ID":"2f65e718-4161-401d-88fd-50c615ce803b","Type":"ContainerDied","Data":"eef7a2bf495de7c23b0047f8d720e7ada75ff73c6bb565ee7d0505da5eb0f043"} Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.728412 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84d446b49-dttj9" Nov 25 16:16:05 crc kubenswrapper[4743]: E1125 16:16:05.732625 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2lqmd" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.794434 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:16:05 crc kubenswrapper[4743]: I1125 16:16:05.810281 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84d446b49-dttj9"] Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.384852 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.385016 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmhln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9vplz_openstack(0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.386217 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9vplz" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.743286 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9vplz" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.782582 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.782752 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcrlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mqvnl_openstack(92e09359-debb-49f3-8490-c18e8ca5f63e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:16:06 crc kubenswrapper[4743]: E1125 16:16:06.784205 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mqvnl" 
podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" Nov 25 16:16:06 crc kubenswrapper[4743]: I1125 16:16:06.822994 4743 scope.go:117] "RemoveContainer" containerID="d999381c5b7ec9406b6f8bf5a3ac48472f5e9ee7112e1b2edfb2ab3164b57425" Nov 25 16:16:06 crc kubenswrapper[4743]: I1125 16:16:06.971947 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:16:06 crc kubenswrapper[4743]: I1125 16:16:06.973565 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:16:06 crc kubenswrapper[4743]: I1125 16:16:06.998465 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.022214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts\") pod \"c0eccb42-7531-45d7-9b61-670d338bf6c1\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.022872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts" (OuterVolumeSpecName: "scripts") pod "c0eccb42-7531-45d7-9b61-670d338bf6c1" (UID: "c0eccb42-7531-45d7-9b61-670d338bf6c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.023383 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts" (OuterVolumeSpecName: "scripts") pod "dd2b63ac-91fc-49e7-8d5e-05a04879baba" (UID: "dd2b63ac-91fc-49e7-8d5e-05a04879baba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.022254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts\") pod \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key\") pod \"c0eccb42-7531-45d7-9b61-670d338bf6c1\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028075 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs\") pod \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028102 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key\") pod \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028244 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs\") pod \"c0eccb42-7531-45d7-9b61-670d338bf6c1\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data\") pod \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028336 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data\") pod \"c0eccb42-7531-45d7-9b61-670d338bf6c1\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8rq4\" (UniqueName: \"kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4\") pod \"c0eccb42-7531-45d7-9b61-670d338bf6c1\" (UID: \"c0eccb42-7531-45d7-9b61-670d338bf6c1\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.028421 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trqlf\" (UniqueName: \"kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf\") pod \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\" (UID: \"dd2b63ac-91fc-49e7-8d5e-05a04879baba\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.030011 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs" (OuterVolumeSpecName: "logs") pod "c0eccb42-7531-45d7-9b61-670d338bf6c1" (UID: "c0eccb42-7531-45d7-9b61-670d338bf6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.030746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs" (OuterVolumeSpecName: "logs") pod "dd2b63ac-91fc-49e7-8d5e-05a04879baba" (UID: "dd2b63ac-91fc-49e7-8d5e-05a04879baba"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.035044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd2b63ac-91fc-49e7-8d5e-05a04879baba" (UID: "dd2b63ac-91fc-49e7-8d5e-05a04879baba"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.035990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data" (OuterVolumeSpecName: "config-data") pod "c0eccb42-7531-45d7-9b61-670d338bf6c1" (UID: "c0eccb42-7531-45d7-9b61-670d338bf6c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.039065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data" (OuterVolumeSpecName: "config-data") pod "dd2b63ac-91fc-49e7-8d5e-05a04879baba" (UID: "dd2b63ac-91fc-49e7-8d5e-05a04879baba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.045989 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c0eccb42-7531-45d7-9b61-670d338bf6c1" (UID: "c0eccb42-7531-45d7-9b61-670d338bf6c1"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.048179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf" (OuterVolumeSpecName: "kube-api-access-trqlf") pod "dd2b63ac-91fc-49e7-8d5e-05a04879baba" (UID: "dd2b63ac-91fc-49e7-8d5e-05a04879baba"). InnerVolumeSpecName "kube-api-access-trqlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.055950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4" (OuterVolumeSpecName: "kube-api-access-l8rq4") pod "c0eccb42-7531-45d7-9b61-670d338bf6c1" (UID: "c0eccb42-7531-45d7-9b61-670d338bf6c1"). InnerVolumeSpecName "kube-api-access-l8rq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057351 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8rq4\" (UniqueName: \"kubernetes.io/projected/c0eccb42-7531-45d7-9b61-670d338bf6c1-kube-api-access-l8rq4\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057379 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trqlf\" (UniqueName: \"kubernetes.io/projected/dd2b63ac-91fc-49e7-8d5e-05a04879baba-kube-api-access-trqlf\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057388 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057396 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057405 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0eccb42-7531-45d7-9b61-670d338bf6c1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057413 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd2b63ac-91fc-49e7-8d5e-05a04879baba-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057422 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd2b63ac-91fc-49e7-8d5e-05a04879baba-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057429 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eccb42-7531-45d7-9b61-670d338bf6c1-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057437 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2b63ac-91fc-49e7-8d5e-05a04879baba-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.057445 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0eccb42-7531-45d7-9b61-670d338bf6c1-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.158030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb\") pod \"ad5e0061-9982-4b48-b8f3-877ecc734668\" (UID: 
\"ad5e0061-9982-4b48-b8f3-877ecc734668\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.158110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc\") pod \"ad5e0061-9982-4b48-b8f3-877ecc734668\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.158299 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfrx8\" (UniqueName: \"kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8\") pod \"ad5e0061-9982-4b48-b8f3-877ecc734668\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.159538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config\") pod \"ad5e0061-9982-4b48-b8f3-877ecc734668\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.159564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb\") pod \"ad5e0061-9982-4b48-b8f3-877ecc734668\" (UID: \"ad5e0061-9982-4b48-b8f3-877ecc734668\") " Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.163098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8" (OuterVolumeSpecName: "kube-api-access-qfrx8") pod "ad5e0061-9982-4b48-b8f3-877ecc734668" (UID: "ad5e0061-9982-4b48-b8f3-877ecc734668"). InnerVolumeSpecName "kube-api-access-qfrx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.213325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad5e0061-9982-4b48-b8f3-877ecc734668" (UID: "ad5e0061-9982-4b48-b8f3-877ecc734668"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.219710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad5e0061-9982-4b48-b8f3-877ecc734668" (UID: "ad5e0061-9982-4b48-b8f3-877ecc734668"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.224119 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad5e0061-9982-4b48-b8f3-877ecc734668" (UID: "ad5e0061-9982-4b48-b8f3-877ecc734668"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.230441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config" (OuterVolumeSpecName: "config") pod "ad5e0061-9982-4b48-b8f3-877ecc734668" (UID: "ad5e0061-9982-4b48-b8f3-877ecc734668"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.262030 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfrx8\" (UniqueName: \"kubernetes.io/projected/ad5e0061-9982-4b48-b8f3-877ecc734668-kube-api-access-qfrx8\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.262060 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.262070 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.262079 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.262088 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5e0061-9982-4b48-b8f3-877ecc734668-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.343370 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7495cddcb-ghpkx"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.362801 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:16:07 crc kubenswrapper[4743]: W1125 16:16:07.365155 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e54ceb1_969a_4172_9928_7e424dd38b5b.slice/crio-05f8703d480095370b2a18bd90465b85d17bd670d1d0b548766d272b42de7ef5 WatchSource:0}: Error finding container 05f8703d480095370b2a18bd90465b85d17bd670d1d0b548766d272b42de7ef5: Status 404 returned error can't find the container with id 05f8703d480095370b2a18bd90465b85d17bd670d1d0b548766d272b42de7ef5 Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.424381 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:16:07 crc kubenswrapper[4743]: W1125 16:16:07.427743 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf117987a_97b8_4338_a8e2_0e298028faab.slice/crio-35ded1038fd75dbddd18e74dcd9113dbb0a6d981b65bb13e10a6f868b6d5d48c WatchSource:0}: Error finding container 35ded1038fd75dbddd18e74dcd9113dbb0a6d981b65bb13e10a6f868b6d5d48c: Status 404 returned error can't find the container with id 35ded1038fd75dbddd18e74dcd9113dbb0a6d981b65bb13e10a6f868b6d5d48c Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.473953 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qx98x"] Nov 25 16:16:07 crc kubenswrapper[4743]: W1125 16:16:07.477990 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bd5b52c_e448_4f66_ab08_4c24f541412d.slice/crio-e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3 WatchSource:0}: Error finding container e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3: Status 404 returned error can't find the container with id e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3 Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.752048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" 
event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerStarted","Data":"e61616598c151dfc5c9a723acc1846c4b18879116a2d6d423af5551cdab8944d"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.759938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerStarted","Data":"4f0da2d223788415cd89ad3d4f0a8e3ec1c0c924891e4d67b0c6c839f9bb917b"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.761530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerStarted","Data":"35ded1038fd75dbddd18e74dcd9113dbb0a6d981b65bb13e10a6f868b6d5d48c"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.763124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f686449-cqrx5" event={"ID":"dd2b63ac-91fc-49e7-8d5e-05a04879baba","Type":"ContainerDied","Data":"9b72acd192156704f3a9226c59c34d6731dde1eeca0cc51267542649c301991a"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.763132 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f686449-cqrx5" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.764909 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qx98x" event={"ID":"7bd5b52c-e448-4f66-ab08-4c24f541412d","Type":"ContainerStarted","Data":"6304cc9db27c76879661d45840a3877910738087c85af9ed315c1a8a4bf37f0c"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.764946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qx98x" event={"ID":"7bd5b52c-e448-4f66-ab08-4c24f541412d","Type":"ContainerStarted","Data":"e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.768484 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b7cc8fb89-4nbgj" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.768486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b7cc8fb89-4nbgj" event={"ID":"c0eccb42-7531-45d7-9b61-670d338bf6c1","Type":"ContainerDied","Data":"1441ccc229b6a7f34ad8f26634bde15e724b6e9df90a3f4a32dd13c9ea5d8c45"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.771348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerStarted","Data":"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.771426 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-log" containerID="cri-o://5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" gracePeriod=30 Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.771481 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-httpd" containerID="cri-o://78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" gracePeriod=30 Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.783538 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-q4m8b" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.783830 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qx98x" podStartSLOduration=4.7838181429999995 podStartE2EDuration="4.783818143s" podCreationTimestamp="2025-11-25 16:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:07.782688308 +0000 UTC m=+1046.904527867" watchObservedRunningTime="2025-11-25 16:16:07.783818143 +0000 UTC m=+1046.905657692" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.805855 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f65e718-4161-401d-88fd-50c615ce803b" path="/var/lib/kubelet/pods/2f65e718-4161-401d-88fd-50c615ce803b/volumes" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.810057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-q4m8b" event={"ID":"ad5e0061-9982-4b48-b8f3-877ecc734668","Type":"ContainerDied","Data":"13b4a8922eae95b5807cca7537e75df4609e6364ac20faa5add1de3d3758c68c"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.810105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7495cddcb-ghpkx" event={"ID":"1e54ceb1-969a-4172-9928-7e424dd38b5b","Type":"ContainerStarted","Data":"05f8703d480095370b2a18bd90465b85d17bd670d1d0b548766d272b42de7ef5"} Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.810141 4743 scope.go:117] "RemoveContainer" containerID="5209329b204f2ec2f717266243f9212ac731a17bd362f2758a97567e6a836f26" Nov 25 16:16:07 crc kubenswrapper[4743]: E1125 16:16:07.816899 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mqvnl" podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.817414 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=28.817384756 podStartE2EDuration="28.817384756s" podCreationTimestamp="2025-11-25 16:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:07.817226251 +0000 UTC m=+1046.939065820" watchObservedRunningTime="2025-11-25 16:16:07.817384756 +0000 UTC m=+1046.939224315" Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.924734 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.936701 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f686449-cqrx5"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.949643 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.964705 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-q4m8b"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.992631 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:16:07 crc kubenswrapper[4743]: I1125 16:16:07.999661 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b7cc8fb89-4nbgj"] Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.019833 4743 scope.go:117] "RemoveContainer" containerID="563b557dcd5b9df464916b719c0dbd176c28ce9c9a412b91a8c7f17d94700509" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.508385 4743 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605372 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wmtz\" (UniqueName: \"kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605514 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.605706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle\") pod \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\" (UID: \"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd\") " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.606752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.606769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs" (OuterVolumeSpecName: "logs") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.610105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.611686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts" (OuterVolumeSpecName: "scripts") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.611756 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz" (OuterVolumeSpecName: "kube-api-access-4wmtz") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "kube-api-access-4wmtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.640234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.696546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data" (OuterVolumeSpecName: "config-data") pod "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" (UID: "0f7e02f3-87ff-48fb-8431-5b4d830e3cbd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707662 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707698 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707710 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707746 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707756 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707769 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.707778 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wmtz\" (UniqueName: \"kubernetes.io/projected/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd-kube-api-access-4wmtz\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.739405 4743 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.806505 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7495cddcb-ghpkx" event={"ID":"1e54ceb1-969a-4172-9928-7e424dd38b5b","Type":"ContainerStarted","Data":"63fc19e4277bbc6a7d7121d9cf8780c0f4551031cd15d7fdc3bf6cd7401ad345"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.806549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7495cddcb-ghpkx" event={"ID":"1e54ceb1-969a-4172-9928-7e424dd38b5b","Type":"ContainerStarted","Data":"f5bd376dcf5c332441a08c46c2c16e3fc51279ac965b0f2fe5798d6354e80977"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.808720 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.815377 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerStarted","Data":"e07b1cb797bd7b8e627e801949d1ca9f975054d787e04f2628340ec9a250ce50"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.817294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerStarted","Data":"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838166 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7495cddcb-ghpkx" podStartSLOduration=20.358765754 podStartE2EDuration="20.838138555s" podCreationTimestamp="2025-11-25 16:15:48 +0000 UTC" firstStartedPulling="2025-11-25 16:16:07.370180076 +0000 UTC m=+1046.492019625" lastFinishedPulling="2025-11-25 
16:16:07.849552877 +0000 UTC m=+1046.971392426" observedRunningTime="2025-11-25 16:16:08.82748874 +0000 UTC m=+1047.949328319" watchObservedRunningTime="2025-11-25 16:16:08.838138555 +0000 UTC m=+1047.959978104" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838325 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerID="78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" exitCode=0 Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838361 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerID="5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" exitCode=143 Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838435 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838438 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerDied","Data":"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.838616 4743 scope.go:117] "RemoveContainer" containerID="78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.839741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerDied","Data":"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.839777 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0f7e02f3-87ff-48fb-8431-5b4d830e3cbd","Type":"ContainerDied","Data":"0e935af4ff25bf7e2b902218ffbf22b4297a84b3e1e3c86b9bd6cf70b1484b5b"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.842836 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerStarted","Data":"c7efc3bf5ec0a3a745508cfbeedbf2a676ec3389b99cbb20a3ba37972d4920be"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.843280 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerStarted","Data":"7f692b888d980122a2a3b6fba9b74f9dbc88be7dfc5a20925b9cd7fd1dea23a9"} Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.883494 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66f797f6cb-zd4ck" podStartSLOduration=20.40894016 podStartE2EDuration="20.883452837s" podCreationTimestamp="2025-11-25 16:15:48 +0000 UTC" firstStartedPulling="2025-11-25 16:16:07.378869299 +0000 UTC m=+1046.500708848" lastFinishedPulling="2025-11-25 16:16:07.853381966 +0000 UTC m=+1046.975221525" observedRunningTime="2025-11-25 16:16:08.870745008 +0000 UTC m=+1047.992584587" watchObservedRunningTime="2025-11-25 16:16:08.883452837 +0000 UTC m=+1048.005292406" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.890479 4743 scope.go:117] "RemoveContainer" containerID="5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.893429 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.906122 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.920727 4743 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.921183 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-log" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921204 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-log" Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.921227 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921233 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="dnsmasq-dns" Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.921246 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-httpd" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921252 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-httpd" Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.921266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="init" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921272 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" containerName="init" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921428 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-log" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921444 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" 
containerName="dnsmasq-dns" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.921453 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" containerName="glance-httpd" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.922372 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.924756 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.926368 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.929684 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.943665 4743 scope.go:117] "RemoveContainer" containerID="78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.947675 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd\": container with ID starting with 78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd not found: ID does not exist" containerID="78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.947732 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd"} err="failed to get container status \"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd\": rpc error: code = NotFound desc = could not find container 
\"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd\": container with ID starting with 78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd not found: ID does not exist" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.947768 4743 scope.go:117] "RemoveContainer" containerID="5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" Nov 25 16:16:08 crc kubenswrapper[4743]: E1125 16:16:08.952700 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5\": container with ID starting with 5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5 not found: ID does not exist" containerID="5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.952748 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5"} err="failed to get container status \"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5\": rpc error: code = NotFound desc = could not find container \"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5\": container with ID starting with 5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5 not found: ID does not exist" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.952787 4743 scope.go:117] "RemoveContainer" containerID="78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.953222 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd"} err="failed to get container status \"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd\": rpc error: code = NotFound desc = could not find 
container \"78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd\": container with ID starting with 78a16abe80daeae1b7197f041213d6d295c89fc4653395f56afc32ae64b8e3bd not found: ID does not exist" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.953246 4743 scope.go:117] "RemoveContainer" containerID="5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5" Nov 25 16:16:08 crc kubenswrapper[4743]: I1125 16:16:08.953517 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5"} err="failed to get container status \"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5\": rpc error: code = NotFound desc = could not find container \"5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5\": container with ID starting with 5d00eeda28cc853c18b619aa6ff31b0921fc62e58022bb2161be7ce98f15acf5 not found: ID does not exist" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.012386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.012449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013136 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.013950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkfld\" (UniqueName: 
\"kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116089 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkfld\" (UniqueName: \"kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116228 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.116422 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.117092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.117117 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.117248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.117407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs\") 
pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.117431 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.126560 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.126730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.129444 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.139277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.149370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkfld\" (UniqueName: \"kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.160115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.172692 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.172728 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.243693 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.244237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.244657 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.796197 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f7e02f3-87ff-48fb-8431-5b4d830e3cbd" path="/var/lib/kubelet/pods/0f7e02f3-87ff-48fb-8431-5b4d830e3cbd/volumes" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.797760 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5e0061-9982-4b48-b8f3-877ecc734668" path="/var/lib/kubelet/pods/ad5e0061-9982-4b48-b8f3-877ecc734668/volumes" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.798678 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eccb42-7531-45d7-9b61-670d338bf6c1" path="/var/lib/kubelet/pods/c0eccb42-7531-45d7-9b61-670d338bf6c1/volumes" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.800019 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2b63ac-91fc-49e7-8d5e-05a04879baba" path="/var/lib/kubelet/pods/dd2b63ac-91fc-49e7-8d5e-05a04879baba/volumes" Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.800498 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:16:09 crc kubenswrapper[4743]: W1125 16:16:09.808895 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca4ab88_574a_473c_aab5_bf977973a9d8.slice/crio-287af40f6e3be8c688110947e6748edaed0074cbbcdab2629245c61f3394194d WatchSource:0}: Error finding container 
287af40f6e3be8c688110947e6748edaed0074cbbcdab2629245c61f3394194d: Status 404 returned error can't find the container with id 287af40f6e3be8c688110947e6748edaed0074cbbcdab2629245c61f3394194d Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.855552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerStarted","Data":"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8"} Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.863050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerStarted","Data":"287af40f6e3be8c688110947e6748edaed0074cbbcdab2629245c61f3394194d"} Nov 25 16:16:09 crc kubenswrapper[4743]: I1125 16:16:09.887040 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.887022406 podStartE2EDuration="16.887022406s" podCreationTimestamp="2025-11-25 16:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:09.885345253 +0000 UTC m=+1049.007184822" watchObservedRunningTime="2025-11-25 16:16:09.887022406 +0000 UTC m=+1049.008861955" Nov 25 16:16:10 crc kubenswrapper[4743]: I1125 16:16:10.872536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerStarted","Data":"83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4"} Nov 25 16:16:11 crc kubenswrapper[4743]: I1125 16:16:11.883712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerStarted","Data":"49f7563dc2a3fe63fc0b75f6c8e23719ce775916bc28f8c6f2f4ae6c5a107d03"} Nov 25 16:16:11 crc kubenswrapper[4743]: I1125 16:16:11.922625 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.922588135 podStartE2EDuration="3.922588135s" podCreationTimestamp="2025-11-25 16:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:11.911940501 +0000 UTC m=+1051.033780070" watchObservedRunningTime="2025-11-25 16:16:11.922588135 +0000 UTC m=+1051.044427684" Nov 25 16:16:12 crc kubenswrapper[4743]: I1125 16:16:12.893199 4743 generic.go:334] "Generic (PLEG): container finished" podID="7bd5b52c-e448-4f66-ab08-4c24f541412d" containerID="6304cc9db27c76879661d45840a3877910738087c85af9ed315c1a8a4bf37f0c" exitCode=0 Nov 25 16:16:12 crc kubenswrapper[4743]: I1125 16:16:12.893967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qx98x" event={"ID":"7bd5b52c-e448-4f66-ab08-4c24f541412d","Type":"ContainerDied","Data":"6304cc9db27c76879661d45840a3877910738087c85af9ed315c1a8a4bf37f0c"} Nov 25 16:16:13 crc kubenswrapper[4743]: I1125 16:16:13.904634 4743 generic.go:334] "Generic (PLEG): container finished" podID="24abed0a-5ed2-486b-ace3-d1b07ee69e5f" containerID="eb73eed19eaf8f467c1547d00948479da0a5657dc4d8bfea99addae5c9fcd4e0" exitCode=0 Nov 25 16:16:13 crc kubenswrapper[4743]: I1125 16:16:13.904805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdbn9" event={"ID":"24abed0a-5ed2-486b-ace3-d1b07ee69e5f","Type":"ContainerDied","Data":"eb73eed19eaf8f467c1547d00948479da0a5657dc4d8bfea99addae5c9fcd4e0"} Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.309357 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.309399 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.343194 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.361008 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.913043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qx98x" event={"ID":"7bd5b52c-e448-4f66-ab08-4c24f541412d","Type":"ContainerDied","Data":"e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3"} Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.913606 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e84893c1f8eb89ab9951a5c1d148f5783ae8a456496e48bed3e352f5d52c7ee3" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.913625 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.913636 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:14 crc kubenswrapper[4743]: I1125 16:16:14.995649 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.127830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.127952 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.128065 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.128128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.128231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.128258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89k8l\" (UniqueName: 
\"kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l\") pod \"7bd5b52c-e448-4f66-ab08-4c24f541412d\" (UID: \"7bd5b52c-e448-4f66-ab08-4c24f541412d\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.135913 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts" (OuterVolumeSpecName: "scripts") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.137736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.141104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l" (OuterVolumeSpecName: "kube-api-access-89k8l") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "kube-api-access-89k8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.145958 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.163098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data" (OuterVolumeSpecName: "config-data") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.181695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bd5b52c-e448-4f66-ab08-4c24f541412d" (UID: "7bd5b52c-e448-4f66-ab08-4c24f541412d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.187712 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230483 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230528 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89k8l\" (UniqueName: \"kubernetes.io/projected/7bd5b52c-e448-4f66-ab08-4c24f541412d-kube-api-access-89k8l\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230555 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230567 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.230580 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7bd5b52c-e448-4f66-ab08-4c24f541412d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.331107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config\") pod \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " Nov 25 
16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.331161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4zw4\" (UniqueName: \"kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4\") pod \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.331371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle\") pod \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\" (UID: \"24abed0a-5ed2-486b-ace3-d1b07ee69e5f\") " Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.333842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4" (OuterVolumeSpecName: "kube-api-access-h4zw4") pod "24abed0a-5ed2-486b-ace3-d1b07ee69e5f" (UID: "24abed0a-5ed2-486b-ace3-d1b07ee69e5f"). InnerVolumeSpecName "kube-api-access-h4zw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.356237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config" (OuterVolumeSpecName: "config") pod "24abed0a-5ed2-486b-ace3-d1b07ee69e5f" (UID: "24abed0a-5ed2-486b-ace3-d1b07ee69e5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.363331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24abed0a-5ed2-486b-ace3-d1b07ee69e5f" (UID: "24abed0a-5ed2-486b-ace3-d1b07ee69e5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.435420 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.435460 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4zw4\" (UniqueName: \"kubernetes.io/projected/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-kube-api-access-h4zw4\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.435472 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24abed0a-5ed2-486b-ace3-d1b07ee69e5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.921923 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdbn9" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.921922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdbn9" event={"ID":"24abed0a-5ed2-486b-ace3-d1b07ee69e5f","Type":"ContainerDied","Data":"db9a6585ff487984a1f73748b8343b4c6805ea01bcff9da6c821b032f2afa1a9"} Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.922316 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db9a6585ff487984a1f73748b8343b4c6805ea01bcff9da6c821b032f2afa1a9" Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.924832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerStarted","Data":"77ca5aae3db4e70f2c1b6e0b7d9ad1512008f38e516e6c097311a0c55044b50a"} Nov 25 16:16:15 crc kubenswrapper[4743]: I1125 16:16:15.924866 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qx98x" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.092045 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"] Nov 25 16:16:16 crc kubenswrapper[4743]: E1125 16:16:16.092461 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24abed0a-5ed2-486b-ace3-d1b07ee69e5f" containerName="neutron-db-sync" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.092479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="24abed0a-5ed2-486b-ace3-d1b07ee69e5f" containerName="neutron-db-sync" Nov 25 16:16:16 crc kubenswrapper[4743]: E1125 16:16:16.092491 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd5b52c-e448-4f66-ab08-4c24f541412d" containerName="keystone-bootstrap" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.092498 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd5b52c-e448-4f66-ab08-4c24f541412d" containerName="keystone-bootstrap" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.092688 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="24abed0a-5ed2-486b-ace3-d1b07ee69e5f" containerName="neutron-db-sync" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.092717 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd5b52c-e448-4f66-ab08-4c24f541412d" containerName="keystone-bootstrap" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.093573 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.101657 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.148110 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-848747fd7b-bljn8"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.149384 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.156988 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.157215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.157337 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.157465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-znjss" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.157907 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.158117 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.160397 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-848747fd7b-bljn8"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.242883 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.244526 4743 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.247786 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.248123 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pc24g" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.248244 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.248343 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-internal-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249690 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-public-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 
16:16:16.249713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-scripts\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmjc\" (UniqueName: \"kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-config-data\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc 
kubenswrapper[4743]: I1125 16:16:16.249866 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6nr\" (UniqueName: \"kubernetes.io/projected/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-kube-api-access-kf6nr\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-combined-ca-bundle\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-credential-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-fernet-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " 
pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.249974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.261061 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-public-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351379 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-scripts\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmjc\" (UniqueName: \"kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-config-data\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvst8\" (UniqueName: 
\"kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6nr\" (UniqueName: \"kubernetes.io/projected/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-kube-api-access-kf6nr\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-combined-ca-bundle\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-credential-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-fernet-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.351702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-internal-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.353302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.353471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.353501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.353970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.354299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config\") pod \"dnsmasq-dns-55f844cf75-8r569\" 
(UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.356686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-credential-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.357462 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-scripts\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.358057 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-internal-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.358294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-combined-ca-bundle\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.359646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-public-tls-certs\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc 
kubenswrapper[4743]: I1125 16:16:16.360424 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-fernet-keys\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.364848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-config-data\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.373984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmjc\" (UniqueName: \"kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc\") pod \"dnsmasq-dns-55f844cf75-8r569\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") " pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.377775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6nr\" (UniqueName: \"kubernetes.io/projected/0e5a8995-2691-4c7f-baee-bf9cdf1b2427-kube-api-access-kf6nr\") pod \"keystone-848747fd7b-bljn8\" (UID: \"0e5a8995-2691-4c7f-baee-bf9cdf1b2427\") " pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.430142 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.454567 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.454833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.454917 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvst8\" (UniqueName: \"kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.455065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.455159 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: 
I1125 16:16:16.461712 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.464780 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.477747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.478244 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.488803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.493792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvst8\" (UniqueName: \"kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8\") pod \"neutron-b9d9d486d-xrstf\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.562934 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.760570 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"] Nov 25 16:16:16 crc kubenswrapper[4743]: I1125 16:16:16.955403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8r569" event={"ID":"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0","Type":"ContainerStarted","Data":"a47d569b8e5c477800ce1752e373edfd9af0026cc1355c507f82a3b2919923f9"} Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.024105 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.024624 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.024842 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 16:16:17 
crc kubenswrapper[4743]: I1125 16:16:17.151499 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-848747fd7b-bljn8"] Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.372534 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"] Nov 25 16:16:17 crc kubenswrapper[4743]: W1125 16:16:17.377254 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda08f6f91_e10d_432d_b8da_acd9f692e6bd.slice/crio-1d527939cef4dbbdfc34ce72b6db2eed1a8f9b58b8e86108b66c7b96caae434b WatchSource:0}: Error finding container 1d527939cef4dbbdfc34ce72b6db2eed1a8f9b58b8e86108b66c7b96caae434b: Status 404 returned error can't find the container with id 1d527939cef4dbbdfc34ce72b6db2eed1a8f9b58b8e86108b66c7b96caae434b Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.982849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerStarted","Data":"2f37558a404668862a517b9a4db9d7f9477c4728e9e8ee51cae5b26209ac8de4"} Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.983132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerStarted","Data":"58187b5c9feb18d00ba0264824147f6af105b5a3d44b7de95c83e9560e5c5d44"} Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.983144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerStarted","Data":"1d527939cef4dbbdfc34ce72b6db2eed1a8f9b58b8e86108b66c7b96caae434b"} Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.983190 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 
16:16:17.985110 4743 generic.go:334] "Generic (PLEG): container finished" podID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerID="ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c" exitCode=0 Nov 25 16:16:17 crc kubenswrapper[4743]: I1125 16:16:17.985163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8r569" event={"ID":"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0","Type":"ContainerDied","Data":"ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c"} Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.022600 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b9d9d486d-xrstf" podStartSLOduration=2.022568293 podStartE2EDuration="2.022568293s" podCreationTimestamp="2025-11-25 16:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:18.021658804 +0000 UTC m=+1057.143498353" watchObservedRunningTime="2025-11-25 16:16:18.022568293 +0000 UTC m=+1057.144407842" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.026892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-848747fd7b-bljn8" event={"ID":"0e5a8995-2691-4c7f-baee-bf9cdf1b2427","Type":"ContainerStarted","Data":"801096ce4c109666547ae23150cd3ae94c67f21fdc945ef1a4461ea56873831f"} Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.026969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-848747fd7b-bljn8" event={"ID":"0e5a8995-2691-4c7f-baee-bf9cdf1b2427","Type":"ContainerStarted","Data":"eed6a17c0d6e1fd8656e0a38daf16e65601ca7fc64e4f27b23366dfbcf826770"} Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.027479 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-848747fd7b-bljn8" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.111313 4743 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/keystone-848747fd7b-bljn8" podStartSLOduration=2.111291178 podStartE2EDuration="2.111291178s" podCreationTimestamp="2025-11-25 16:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:18.079331135 +0000 UTC m=+1057.201170684" watchObservedRunningTime="2025-11-25 16:16:18.111291178 +0000 UTC m=+1057.233130747" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.951252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cc5fc48dc-hkvc8"] Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.955957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.958117 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.958322 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 16:16:18 crc kubenswrapper[4743]: I1125 16:16:18.960772 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cc5fc48dc-hkvc8"] Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.047542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8r569" event={"ID":"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0","Type":"ContainerStarted","Data":"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"} Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-httpd-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 
16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvb6\" (UniqueName: \"kubernetes.io/projected/c8823220-9bb8-44a4-a4a6-00661d8e2fad-kube-api-access-jbvb6\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050398 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050464 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-internal-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-combined-ca-bundle\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050567 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-ovndb-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 
16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.050612 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-public-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-ovndb-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-public-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-httpd-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvb6\" (UniqueName: \"kubernetes.io/projected/c8823220-9bb8-44a4-a4a6-00661d8e2fad-kube-api-access-jbvb6\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153845 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.153952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-internal-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.154025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-combined-ca-bundle\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.166235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-internal-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.166809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-public-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.173264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-httpd-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.175009 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.181841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-combined-ca-bundle\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.182473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-config\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.188223 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8823220-9bb8-44a4-a4a6-00661d8e2fad-ovndb-tls-certs\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: \"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.190648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvb6\" (UniqueName: \"kubernetes.io/projected/c8823220-9bb8-44a4-a4a6-00661d8e2fad-kube-api-access-jbvb6\") pod \"neutron-cc5fc48dc-hkvc8\" (UID: 
\"c8823220-9bb8-44a4-a4a6-00661d8e2fad\") " pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.255169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.255212 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.258423 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7495cddcb-ghpkx" podUID="1e54ceb1-969a-4172-9928-7e424dd38b5b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.280219 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.369991 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 16:16:19 crc kubenswrapper[4743]: I1125 16:16:19.384328 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.038526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cc5fc48dc-hkvc8"] Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.057478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cc5fc48dc-hkvc8" event={"ID":"c8823220-9bb8-44a4-a4a6-00661d8e2fad","Type":"ContainerStarted","Data":"c72af44fe3cf0f86372d2423ea45af1a8fc0744841f822e514881cc34f8c73d3"} Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.057518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.058766 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.058788 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:20 crc kubenswrapper[4743]: I1125 16:16:20.088650 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-8r569" podStartSLOduration=4.08863044 podStartE2EDuration="4.08863044s" podCreationTimestamp="2025-11-25 16:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:20.079185433 +0000 UTC m=+1059.201025002" watchObservedRunningTime="2025-11-25 16:16:20.08863044 +0000 UTC m=+1059.210469989" Nov 25 16:16:21 crc kubenswrapper[4743]: I1125 16:16:21.068951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cc5fc48dc-hkvc8" event={"ID":"c8823220-9bb8-44a4-a4a6-00661d8e2fad","Type":"ContainerStarted","Data":"575f5a1d1012d887f80cd837a7e592f58eaee856e529882e26de4d3d46e52fc5"} Nov 25 16:16:22 crc kubenswrapper[4743]: I1125 16:16:22.074481 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 16:16:22 crc kubenswrapper[4743]: I1125 16:16:22.074835 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 16:16:22 crc kubenswrapper[4743]: I1125 16:16:22.106157 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 16:16:22 crc kubenswrapper[4743]: I1125 16:16:22.110023 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.107567 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cc5fc48dc-hkvc8" event={"ID":"c8823220-9bb8-44a4-a4a6-00661d8e2fad","Type":"ContainerStarted","Data":"e63aa545940db07cca77c7d97b01e36a00a2040bc859ca527a190177ecf183b7"} Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.108131 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cc5fc48dc-hkvc8" Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.151928 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cc5fc48dc-hkvc8" podStartSLOduration=8.151909695 podStartE2EDuration="8.151909695s" podCreationTimestamp="2025-11-25 16:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:26.144455062 +0000 UTC m=+1065.266294621" watchObservedRunningTime="2025-11-25 16:16:26.151909695 +0000 UTC m=+1065.273749244" Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.432548 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-8r569" Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.526316 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.527469 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="dnsmasq-dns" containerID="cri-o://4c7d7730b31c5b26b34a28f8bdd4da286b0ede847d9f63cd84b135dc5302dd8e" gracePeriod=10 Nov 25 16:16:26 crc kubenswrapper[4743]: I1125 16:16:26.718265 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: 
connection refused" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.125299 4743 generic.go:334] "Generic (PLEG): container finished" podID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerID="4c7d7730b31c5b26b34a28f8bdd4da286b0ede847d9f63cd84b135dc5302dd8e" exitCode=0 Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.125655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" event={"ID":"3da25308-0a6b-49be-b9af-c010f9a1945d","Type":"ContainerDied","Data":"4c7d7730b31c5b26b34a28f8bdd4da286b0ede847d9f63cd84b135dc5302dd8e"} Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.256124 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtbpc\" (UniqueName: \"kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311579 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.311612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb\") pod \"3da25308-0a6b-49be-b9af-c010f9a1945d\" (UID: \"3da25308-0a6b-49be-b9af-c010f9a1945d\") " Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.318718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc" (OuterVolumeSpecName: "kube-api-access-vtbpc") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "kube-api-access-vtbpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.381699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.383723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config" (OuterVolumeSpecName: "config") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.393824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.395482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.407179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3da25308-0a6b-49be-b9af-c010f9a1945d" (UID: "3da25308-0a6b-49be-b9af-c010f9a1945d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412805 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412830 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412839 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412847 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412859 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3da25308-0a6b-49be-b9af-c010f9a1945d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:27 crc kubenswrapper[4743]: I1125 16:16:27.412869 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtbpc\" (UniqueName: \"kubernetes.io/projected/3da25308-0a6b-49be-b9af-c010f9a1945d-kube-api-access-vtbpc\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.136708 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-central-agent" containerID="cri-o://4f0da2d223788415cd89ad3d4f0a8e3ec1c0c924891e4d67b0c6c839f9bb917b" 
gracePeriod=30 Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.136747 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="proxy-httpd" containerID="cri-o://93ef641d94e456b042a83b541b33e5a7648530f145123f3b9410d769eb45ed36" gracePeriod=30 Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.136762 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="sg-core" containerID="cri-o://77ca5aae3db4e70f2c1b6e0b7d9ad1512008f38e516e6c097311a0c55044b50a" gracePeriod=30 Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.136803 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-notification-agent" containerID="cri-o://e07b1cb797bd7b8e627e801949d1ca9f975054d787e04f2628340ec9a250ce50" gracePeriod=30 Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.136804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerStarted","Data":"93ef641d94e456b042a83b541b33e5a7648530f145123f3b9410d769eb45ed36"} Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.139093 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.139820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cg5jj" event={"ID":"3da25308-0a6b-49be-b9af-c010f9a1945d","Type":"ContainerDied","Data":"3c77e6dc3182c6bf9d282ced5a542a003d4ef9dc4b2429da04df5322e56a3ed1"} Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.139850 4743 scope.go:117] "RemoveContainer" containerID="4c7d7730b31c5b26b34a28f8bdd4da286b0ede847d9f63cd84b135dc5302dd8e" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.152011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2lqmd" event={"ID":"5a037faa-f5b9-4abb-8132-2750befdf031","Type":"ContainerStarted","Data":"9ee9e9de02a98ab5efcaf524bd46dfeaa0769284a00cafa5cda9abedfc1a3381"} Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.154138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqvnl" event={"ID":"92e09359-debb-49f3-8490-c18e8ca5f63e","Type":"ContainerStarted","Data":"22161b8d3ffa7ed61b572c2d37f445d76c6c813c368511f7e966c7dd4571704d"} Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.162513 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vplz" event={"ID":"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba","Type":"ContainerStarted","Data":"591f6779b50fcb1835cc05a42164ab8f18ebc40ba20de925362260c2ab31f8ce"} Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.170525 4743 scope.go:117] "RemoveContainer" containerID="97af644eeffd8f3c88600642ff8cd8513feb6479e102525f5042d02208bfdbf9" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.181028 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.878085495 podStartE2EDuration="53.181006371s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="2025-11-25 
16:15:37.248369818 +0000 UTC m=+1016.370209367" lastFinishedPulling="2025-11-25 16:16:26.551290694 +0000 UTC m=+1065.673130243" observedRunningTime="2025-11-25 16:16:28.160919531 +0000 UTC m=+1067.282759090" watchObservedRunningTime="2025-11-25 16:16:28.181006371 +0000 UTC m=+1067.302845920" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.187574 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2lqmd" podStartSLOduration=4.03555636 podStartE2EDuration="53.187559028s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="2025-11-25 16:15:37.313881595 +0000 UTC m=+1016.435721144" lastFinishedPulling="2025-11-25 16:16:26.465884263 +0000 UTC m=+1065.587723812" observedRunningTime="2025-11-25 16:16:28.176463239 +0000 UTC m=+1067.298302788" watchObservedRunningTime="2025-11-25 16:16:28.187559028 +0000 UTC m=+1067.309398577" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.208712 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mqvnl" podStartSLOduration=3.756347484 podStartE2EDuration="53.208695411s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="2025-11-25 16:15:37.013195335 +0000 UTC m=+1016.135034884" lastFinishedPulling="2025-11-25 16:16:26.465543252 +0000 UTC m=+1065.587382811" observedRunningTime="2025-11-25 16:16:28.204008414 +0000 UTC m=+1067.325847963" watchObservedRunningTime="2025-11-25 16:16:28.208695411 +0000 UTC m=+1067.330534960" Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.228910 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.237438 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cg5jj"] Nov 25 16:16:28 crc kubenswrapper[4743]: I1125 16:16:28.247447 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-db-sync-9vplz" podStartSLOduration=3.781714801 podStartE2EDuration="53.247430168s" podCreationTimestamp="2025-11-25 16:15:35 +0000 UTC" firstStartedPulling="2025-11-25 16:15:37.01049635 +0000 UTC m=+1016.132335889" lastFinishedPulling="2025-11-25 16:16:26.476211707 +0000 UTC m=+1065.598051256" observedRunningTime="2025-11-25 16:16:28.237989781 +0000 UTC m=+1067.359829340" watchObservedRunningTime="2025-11-25 16:16:28.247430168 +0000 UTC m=+1067.369269717" Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177063 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerID="93ef641d94e456b042a83b541b33e5a7648530f145123f3b9410d769eb45ed36" exitCode=0 Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177486 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerID="77ca5aae3db4e70f2c1b6e0b7d9ad1512008f38e516e6c097311a0c55044b50a" exitCode=2 Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177498 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerID="4f0da2d223788415cd89ad3d4f0a8e3ec1c0c924891e4d67b0c6c839f9bb917b" exitCode=0 Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerDied","Data":"93ef641d94e456b042a83b541b33e5a7648530f145123f3b9410d769eb45ed36"} Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177566 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerDied","Data":"77ca5aae3db4e70f2c1b6e0b7d9ad1512008f38e516e6c097311a0c55044b50a"} Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.177580 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerDied","Data":"4f0da2d223788415cd89ad3d4f0a8e3ec1c0c924891e4d67b0c6c839f9bb917b"} Nov 25 16:16:29 crc kubenswrapper[4743]: I1125 16:16:29.789884 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" path="/var/lib/kubelet/pods/3da25308-0a6b-49be-b9af-c010f9a1945d/volumes" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.189124 4743 generic.go:334] "Generic (PLEG): container finished" podID="5a037faa-f5b9-4abb-8132-2750befdf031" containerID="9ee9e9de02a98ab5efcaf524bd46dfeaa0769284a00cafa5cda9abedfc1a3381" exitCode=0 Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.189263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2lqmd" event={"ID":"5a037faa-f5b9-4abb-8132-2750befdf031","Type":"ContainerDied","Data":"9ee9e9de02a98ab5efcaf524bd46dfeaa0769284a00cafa5cda9abedfc1a3381"} Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.193639 4743 generic.go:334] "Generic (PLEG): container finished" podID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerID="e07b1cb797bd7b8e627e801949d1ca9f975054d787e04f2628340ec9a250ce50" exitCode=0 Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.193676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerDied","Data":"e07b1cb797bd7b8e627e801949d1ca9f975054d787e04f2628340ec9a250ce50"} Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.193697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1090d60-66a4-45b2-b37c-fca05d77a7c2","Type":"ContainerDied","Data":"2ccb2949efe37ffa7e6915b738b1ebfc9f2c26faaef5b9912f80767aaf9cb39a"} Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.193707 4743 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2ccb2949efe37ffa7e6915b738b1ebfc9f2c26faaef5b9912f80767aaf9cb39a" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.265754 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462348 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462405 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462502 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462605 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 
25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462670 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkzj\" (UniqueName: \"kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj\") pod \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\" (UID: \"e1090d60-66a4-45b2-b37c-fca05d77a7c2\") " Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.462915 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.463124 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.463381 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.468102 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts" (OuterVolumeSpecName: "scripts") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.480888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj" (OuterVolumeSpecName: "kube-api-access-vqkzj") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "kube-api-access-vqkzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.510847 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.556978 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.560358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data" (OuterVolumeSpecName: "config-data") pod "e1090d60-66a4-45b2-b37c-fca05d77a7c2" (UID: "e1090d60-66a4-45b2-b37c-fca05d77a7c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565033 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565067 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565081 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565092 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1090d60-66a4-45b2-b37c-fca05d77a7c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565102 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1090d60-66a4-45b2-b37c-fca05d77a7c2-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:30 crc kubenswrapper[4743]: I1125 16:16:30.565115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqkzj\" (UniqueName: 
\"kubernetes.io/projected/e1090d60-66a4-45b2-b37c-fca05d77a7c2-kube-api-access-vqkzj\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.179760 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.204753 4743 generic.go:334] "Generic (PLEG): container finished" podID="92e09359-debb-49f3-8490-c18e8ca5f63e" containerID="22161b8d3ffa7ed61b572c2d37f445d76c6c813c368511f7e966c7dd4571704d" exitCode=0 Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.204796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqvnl" event={"ID":"92e09359-debb-49f3-8490-c18e8ca5f63e","Type":"ContainerDied","Data":"22161b8d3ffa7ed61b572c2d37f445d76c6c813c368511f7e966c7dd4571704d"} Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.204927 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.261329 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.271793 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284020 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284474 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-notification-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284499 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-notification-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284527 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-central-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284538 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-central-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284550 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="init" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284558 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="init" Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284577 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="proxy-httpd" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284586 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="proxy-httpd" Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284646 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="dnsmasq-dns" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284655 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="dnsmasq-dns" Nov 25 16:16:31 crc kubenswrapper[4743]: E1125 16:16:31.284676 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="sg-core" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284685 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="sg-core" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284900 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="sg-core" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284964 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="proxy-httpd" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.284981 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-central-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.285002 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" containerName="ceilometer-notification-agent" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.285019 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da25308-0a6b-49be-b9af-c010f9a1945d" containerName="dnsmasq-dns" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.288003 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.293862 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.295554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.295696 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.323546 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92trk\" (UniqueName: \"kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.380390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " 
pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92trk\" (UniqueName: \"kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.500661 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.501126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.502938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.507286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.509671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.515293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.515696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.518993 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-92trk\" (UniqueName: \"kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk\") pod \"ceilometer-0\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.595717 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2lqmd" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.608430 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.703092 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data\") pod \"5a037faa-f5b9-4abb-8132-2750befdf031\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.703420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle\") pod \"5a037faa-f5b9-4abb-8132-2750befdf031\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.703444 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts\") pod \"5a037faa-f5b9-4abb-8132-2750befdf031\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.703519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2f2s\" (UniqueName: \"kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s\") pod \"5a037faa-f5b9-4abb-8132-2750befdf031\" (UID: 
\"5a037faa-f5b9-4abb-8132-2750befdf031\") " Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.703571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs\") pod \"5a037faa-f5b9-4abb-8132-2750befdf031\" (UID: \"5a037faa-f5b9-4abb-8132-2750befdf031\") " Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.704113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs" (OuterVolumeSpecName: "logs") pod "5a037faa-f5b9-4abb-8132-2750befdf031" (UID: "5a037faa-f5b9-4abb-8132-2750befdf031"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.706904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s" (OuterVolumeSpecName: "kube-api-access-t2f2s") pod "5a037faa-f5b9-4abb-8132-2750befdf031" (UID: "5a037faa-f5b9-4abb-8132-2750befdf031"). InnerVolumeSpecName "kube-api-access-t2f2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.708450 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts" (OuterVolumeSpecName: "scripts") pod "5a037faa-f5b9-4abb-8132-2750befdf031" (UID: "5a037faa-f5b9-4abb-8132-2750befdf031"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.730120 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a037faa-f5b9-4abb-8132-2750befdf031" (UID: "5a037faa-f5b9-4abb-8132-2750befdf031"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.736185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data" (OuterVolumeSpecName: "config-data") pod "5a037faa-f5b9-4abb-8132-2750befdf031" (UID: "5a037faa-f5b9-4abb-8132-2750befdf031"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.786514 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1090d60-66a4-45b2-b37c-fca05d77a7c2" path="/var/lib/kubelet/pods/e1090d60-66a4-45b2-b37c-fca05d77a7c2/volumes" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.806060 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.806092 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.806106 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a037faa-f5b9-4abb-8132-2750befdf031-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:31 crc 
kubenswrapper[4743]: I1125 16:16:31.806115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2f2s\" (UniqueName: \"kubernetes.io/projected/5a037faa-f5b9-4abb-8132-2750befdf031-kube-api-access-t2f2s\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:31 crc kubenswrapper[4743]: I1125 16:16:31.806125 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a037faa-f5b9-4abb-8132-2750befdf031-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.054380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.215444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2lqmd" event={"ID":"5a037faa-f5b9-4abb-8132-2750befdf031","Type":"ContainerDied","Data":"0278e84e216431654f027622aca6ab372c9a0a02b8e29a33297c26ec29c591c1"} Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.215503 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0278e84e216431654f027622aca6ab372c9a0a02b8e29a33297c26ec29c591c1" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.215574 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2lqmd" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.218156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerStarted","Data":"2dc2ac55b579c4a3894e80de00264facc183707761e957ed6ab9464d8ca0e724"} Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.361191 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6dd8557654-lgr92"] Nov 25 16:16:32 crc kubenswrapper[4743]: E1125 16:16:32.366383 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" containerName="placement-db-sync" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.366420 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" containerName="placement-db-sync" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.366688 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" containerName="placement-db-sync" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.372111 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.378573 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.378923 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.379535 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.379721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7zrbn" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.379891 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.399820 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6dd8557654-lgr92"] Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659f8a21-e29e-47be-903b-742de8ec9b22-logs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-public-tls-certs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452681 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-combined-ca-bundle\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-config-data\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-scripts\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwn6\" (UniqueName: \"kubernetes.io/projected/659f8a21-e29e-47be-903b-742de8ec9b22-kube-api-access-clwn6\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.452803 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-internal-tls-certs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555331 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659f8a21-e29e-47be-903b-742de8ec9b22-logs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-public-tls-certs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-combined-ca-bundle\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-config-data\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-scripts\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwn6\" (UniqueName: 
\"kubernetes.io/projected/659f8a21-e29e-47be-903b-742de8ec9b22-kube-api-access-clwn6\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.555607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-internal-tls-certs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.556771 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/659f8a21-e29e-47be-903b-742de8ec9b22-logs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.563457 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-public-tls-certs\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.568899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-combined-ca-bundle\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.569256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-internal-tls-certs\") pod 
\"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.569272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-config-data\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.577024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/659f8a21-e29e-47be-903b-742de8ec9b22-scripts\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.578274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwn6\" (UniqueName: \"kubernetes.io/projected/659f8a21-e29e-47be-903b-742de8ec9b22-kube-api-access-clwn6\") pod \"placement-6dd8557654-lgr92\" (UID: \"659f8a21-e29e-47be-903b-742de8ec9b22\") " pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.653017 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.712515 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.758711 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data\") pod \"92e09359-debb-49f3-8490-c18e8ca5f63e\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.759195 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcrlp\" (UniqueName: \"kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp\") pod \"92e09359-debb-49f3-8490-c18e8ca5f63e\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.759243 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle\") pod \"92e09359-debb-49f3-8490-c18e8ca5f63e\" (UID: \"92e09359-debb-49f3-8490-c18e8ca5f63e\") " Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.761853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "92e09359-debb-49f3-8490-c18e8ca5f63e" (UID: "92e09359-debb-49f3-8490-c18e8ca5f63e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.763308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp" (OuterVolumeSpecName: "kube-api-access-bcrlp") pod "92e09359-debb-49f3-8490-c18e8ca5f63e" (UID: "92e09359-debb-49f3-8490-c18e8ca5f63e"). 
InnerVolumeSpecName "kube-api-access-bcrlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.783738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92e09359-debb-49f3-8490-c18e8ca5f63e" (UID: "92e09359-debb-49f3-8490-c18e8ca5f63e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.861874 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.861907 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcrlp\" (UniqueName: \"kubernetes.io/projected/92e09359-debb-49f3-8490-c18e8ca5f63e-kube-api-access-bcrlp\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.861921 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e09359-debb-49f3-8490-c18e8ca5f63e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.911967 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7495cddcb-ghpkx" Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.981067 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.981347 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon-log" 
containerID="cri-o://7f692b888d980122a2a3b6fba9b74f9dbc88be7dfc5a20925b9cd7fd1dea23a9" gracePeriod=30 Nov 25 16:16:32 crc kubenswrapper[4743]: I1125 16:16:32.983738 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" containerID="cri-o://c7efc3bf5ec0a3a745508cfbeedbf2a676ec3389b99cbb20a3ba37972d4920be" gracePeriod=30 Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.019604 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.158482 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6dd8557654-lgr92"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.230465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mqvnl" event={"ID":"92e09359-debb-49f3-8490-c18e8ca5f63e","Type":"ContainerDied","Data":"4ad6177fdf25772bd0b04add034d299083beb0a5fa849328fb0a329df93fad5e"} Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.231657 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad6177fdf25772bd0b04add034d299083beb0a5fa849328fb0a329df93fad5e" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.231883 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mqvnl" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.238516 4743 generic.go:334] "Generic (PLEG): container finished" podID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" containerID="591f6779b50fcb1835cc05a42164ab8f18ebc40ba20de925362260c2ab31f8ce" exitCode=0 Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.238584 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vplz" event={"ID":"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba","Type":"ContainerDied","Data":"591f6779b50fcb1835cc05a42164ab8f18ebc40ba20de925362260c2ab31f8ce"} Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.240574 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6dd8557654-lgr92" event={"ID":"659f8a21-e29e-47be-903b-742de8ec9b22","Type":"ContainerStarted","Data":"bfaccdcd19c40216f2b84230812e9720d67daaae1643574f2382a65c6047c0f2"} Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.245179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerStarted","Data":"53df142b9bba71f1886ea8ec33b4a900678c6db441cc4eb1f79b40dbdb53bc45"} Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.471533 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c9cfd9b5-p7l4w"] Nov 25 16:16:33 crc kubenswrapper[4743]: E1125 16:16:33.472385 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" containerName="barbican-db-sync" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.472409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" containerName="barbican-db-sync" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.479015 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" 
containerName="barbican-db-sync" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.480688 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.486334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.486870 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bhzgf" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.487084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.502706 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c9cfd9b5-p7l4w"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.535711 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-74944f4b54-xg775"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.538533 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.542779 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.550209 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74944f4b54-xg775"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588423 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-combined-ca-bundle\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef127ba1-444d-4f1c-937b-965c7ce47d1a-logs\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data-custom\") pod 
\"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.588569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/095f59f0-0093-4e6d-8aa3-0ddc0161b213-logs\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.590568 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-combined-ca-bundle\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.591080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbtc9\" (UniqueName: \"kubernetes.io/projected/095f59f0-0093-4e6d-8aa3-0ddc0161b213-kube-api-access-wbtc9\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.591116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-5cdr2\" (UniqueName: \"kubernetes.io/projected/ef127ba1-444d-4f1c-937b-965c7ce47d1a-kube-api-access-5cdr2\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.591142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data-custom\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.639537 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.645317 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.658867 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-combined-ca-bundle\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " 
pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef127ba1-444d-4f1c-937b-965c7ce47d1a-logs\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data-custom\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/095f59f0-0093-4e6d-8aa3-0ddc0161b213-logs\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " 
pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-combined-ca-bundle\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693454 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbtc9\" (UniqueName: \"kubernetes.io/projected/095f59f0-0093-4e6d-8aa3-0ddc0161b213-kube-api-access-wbtc9\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cdr2\" (UniqueName: \"kubernetes.io/projected/ef127ba1-444d-4f1c-937b-965c7ce47d1a-kube-api-access-5cdr2\") pod 
\"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4c8x\" (UniqueName: \"kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data-custom\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693581 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.693646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.699901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.700175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef127ba1-444d-4f1c-937b-965c7ce47d1a-logs\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.701902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.703335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/095f59f0-0093-4e6d-8aa3-0ddc0161b213-logs\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.705319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-combined-ca-bundle\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.706389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-combined-ca-bundle\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.708612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/095f59f0-0093-4e6d-8aa3-0ddc0161b213-config-data-custom\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.710810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef127ba1-444d-4f1c-937b-965c7ce47d1a-config-data-custom\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.713820 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.720884 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.725133 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.725880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cdr2\" (UniqueName: \"kubernetes.io/projected/ef127ba1-444d-4f1c-937b-965c7ce47d1a-kube-api-access-5cdr2\") pod \"barbican-worker-c9cfd9b5-p7l4w\" (UID: \"ef127ba1-444d-4f1c-937b-965c7ce47d1a\") " pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.730757 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.733112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbtc9\" (UniqueName: \"kubernetes.io/projected/095f59f0-0093-4e6d-8aa3-0ddc0161b213-kube-api-access-wbtc9\") pod \"barbican-keystone-listener-74944f4b54-xg775\" (UID: \"095f59f0-0093-4e6d-8aa3-0ddc0161b213\") " pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " 
pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4c8x\" (UniqueName: \"kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799887 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: 
\"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.799983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.800030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.800066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.800088 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsfzg\" (UniqueName: \"kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.801473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " 
pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.814324 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c9cfd9b5-p7l4w" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.831824 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4c8x\" (UniqueName: \"kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.848395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.849433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.849566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.850261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0\") 
pod \"dnsmasq-dns-85ff748b95-kq6fr\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.871754 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.901682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.901775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.901897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.901923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.902093 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hsfzg\" (UniqueName: \"kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.904743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.908404 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.910031 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.916185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.933737 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsfzg\" (UniqueName: 
\"kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg\") pod \"barbican-api-7775b654cd-srptg\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:33 crc kubenswrapper[4743]: I1125 16:16:33.989987 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.063288 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.283620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6dd8557654-lgr92" event={"ID":"659f8a21-e29e-47be-903b-742de8ec9b22","Type":"ContainerStarted","Data":"6ffa7b229dcdd03ac424c16f5e316702c37fa3f930174ccacf563b6d55717628"} Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.283949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6dd8557654-lgr92" event={"ID":"659f8a21-e29e-47be-903b-742de8ec9b22","Type":"ContainerStarted","Data":"a9f25911ede2067fcf07df9a14bb6c2846ff7bc0b4718413ccd8ab23b85dbc4c"} Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.284847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.284884 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.295701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerStarted","Data":"46838bad8c1361a4ae8274eea46d0829dfb5b46dcf6c017394bb3e7709383db3"} Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.573550 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-6dd8557654-lgr92" podStartSLOduration=2.5735326240000003 podStartE2EDuration="2.573532624s" podCreationTimestamp="2025-11-25 16:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:34.30998876 +0000 UTC m=+1073.431828329" watchObservedRunningTime="2025-11-25 16:16:34.573532624 +0000 UTC m=+1073.695372173" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.574882 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74944f4b54-xg775"] Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.583934 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c9cfd9b5-p7l4w"] Nov 25 16:16:34 crc kubenswrapper[4743]: W1125 16:16:34.594089 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef127ba1_444d_4f1c_937b_965c7ce47d1a.slice/crio-5cda7fb4a7dfd28021551598aeb4a097247bdedcf0f644608b4bd361c6fcdae9 WatchSource:0}: Error finding container 5cda7fb4a7dfd28021551598aeb4a097247bdedcf0f644608b4bd361c6fcdae9: Status 404 returned error can't find the container with id 5cda7fb4a7dfd28021551598aeb4a097247bdedcf0f644608b4bd361c6fcdae9 Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.679448 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9vplz" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.725277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.725362 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.725542 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmhln\" (UniqueName: \"kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.730061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.730100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.730130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id\") pod \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\" (UID: \"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba\") " Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.730975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.736780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.738467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts" (OuterVolumeSpecName: "scripts") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.738493 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln" (OuterVolumeSpecName: "kube-api-access-rmhln") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "kube-api-access-rmhln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.756613 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.785555 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data" (OuterVolumeSpecName: "config-data") pod "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" (UID: "0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:34 crc kubenswrapper[4743]: W1125 16:16:34.817813 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c67ca8_13cb_492f_b37a_6034a8b4f18b.slice/crio-69158f234c1d97347c16f3e8b149fd903d10e3ee231885cbb4076396b54595bf WatchSource:0}: Error finding container 69158f234c1d97347c16f3e8b149fd903d10e3ee231885cbb4076396b54595bf: Status 404 returned error can't find the container with id 69158f234c1d97347c16f3e8b149fd903d10e3ee231885cbb4076396b54595bf Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.819342 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.826739 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:34 crc kubenswrapper[4743]: W1125 16:16:34.830677 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf8182fd_592f_4c32_8270_a574eacb5638.slice/crio-f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc WatchSource:0}: Error finding container f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc: Status 404 returned error can't find the container with id f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832736 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmhln\" (UniqueName: \"kubernetes.io/projected/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-kube-api-access-rmhln\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832756 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832765 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832774 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832783 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:34 crc kubenswrapper[4743]: I1125 16:16:34.832792 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.305437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerStarted","Data":"98009a63f4539c6366807c5466c22f208c48db044be18b5f8559eddc1faf9d72"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.305841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerStarted","Data":"69158f234c1d97347c16f3e8b149fd903d10e3ee231885cbb4076396b54595bf"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.307381 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf8182fd-592f-4c32-8270-a574eacb5638" containerID="1270cefdd71d45d8ff275f0666d11f7b39999fb327c62c89be28b6a053603b41" exitCode=0 Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.308361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" event={"ID":"bf8182fd-592f-4c32-8270-a574eacb5638","Type":"ContainerDied","Data":"1270cefdd71d45d8ff275f0666d11f7b39999fb327c62c89be28b6a053603b41"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.308435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" event={"ID":"bf8182fd-592f-4c32-8270-a574eacb5638","Type":"ContainerStarted","Data":"f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.309842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" event={"ID":"095f59f0-0093-4e6d-8aa3-0ddc0161b213","Type":"ContainerStarted","Data":"a7a608c642832b359e1a03e6decaaea2efeff2179dd11683aacdf300cbfc1996"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.311380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-c9cfd9b5-p7l4w" event={"ID":"ef127ba1-444d-4f1c-937b-965c7ce47d1a","Type":"ContainerStarted","Data":"5cda7fb4a7dfd28021551598aeb4a097247bdedcf0f644608b4bd361c6fcdae9"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.313188 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9vplz" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.313228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9vplz" event={"ID":"0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba","Type":"ContainerDied","Data":"c87884a35cbd45f2a3b6895e6f2ac0dd58380afebe645319a92d2777ad61a61a"} Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.313245 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87884a35cbd45f2a3b6895e6f2ac0dd58380afebe645319a92d2777ad61a61a" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.587037 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:35 crc kubenswrapper[4743]: E1125 16:16:35.587486 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" containerName="cinder-db-sync" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.587504 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" containerName="cinder-db-sync" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.587698 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" containerName="cinder-db-sync" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.589246 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.592100 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8fqv7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.592583 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.592769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.592898 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.622879 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.643750 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658750 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmm57\" (UniqueName: \"kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658815 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.658850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.679266 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.680774 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.695528 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.763811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.763879 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.763904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.763949 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.763972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmm57\" (UniqueName: \"kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764063 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764121 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config\") pod 
\"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns968\" (UniqueName: \"kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.764280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.770767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.773301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.776057 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " 
pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.778217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.785605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.794802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmm57\" (UniqueName: \"kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57\") pod \"cinder-scheduler-0\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.820125 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.824915 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.828104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.835447 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.865346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.865685 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns968\" (UniqueName: \"kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.865860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.865957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.866062 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.866161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.867262 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.867609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.868026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.868208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.868386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.895302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns968\" (UniqueName: \"kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968\") pod \"dnsmasq-dns-5c9776ccc5-p6wg7\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.922438 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.971725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.971792 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.971836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndklb\" (UniqueName: \"kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.971859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.983496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.983658 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:35 crc kubenswrapper[4743]: I1125 16:16:35.983733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.038422 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.089414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.089485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.089614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.089708 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.091549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.093202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndklb\" (UniqueName: \"kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.093255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.093353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.093848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " 
pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.095331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.097530 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.099048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.109753 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.116740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndklb\" (UniqueName: \"kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb\") pod \"cinder-api-0\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.148204 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.272759 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:49386->10.217.0.147:8443: read: connection reset by peer" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.338876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerStarted","Data":"e7b0c5288d719c79d924fb9a41486b1572517d2137bad15b507a0cdbe4f3a866"} Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.338995 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.339021 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.345296 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="dnsmasq-dns" containerID="cri-o://c53caca7d07a07f9d55d1062d2c96b6a0742a755f21faeb5ed134845cf98bfc3" gracePeriod=10 Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.345286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" event={"ID":"bf8182fd-592f-4c32-8270-a574eacb5638","Type":"ContainerStarted","Data":"c53caca7d07a07f9d55d1062d2c96b6a0742a755f21faeb5ed134845cf98bfc3"} Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.345433 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 
16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.370433 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7775b654cd-srptg" podStartSLOduration=3.3704097490000002 podStartE2EDuration="3.370409749s" podCreationTimestamp="2025-11-25 16:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:36.36405107 +0000 UTC m=+1075.485890629" watchObservedRunningTime="2025-11-25 16:16:36.370409749 +0000 UTC m=+1075.492249288" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.379104 4743 generic.go:334] "Generic (PLEG): container finished" podID="65514eee-0e20-40f2-b381-21311ae5e899" containerID="c7efc3bf5ec0a3a745508cfbeedbf2a676ec3389b99cbb20a3ba37972d4920be" exitCode=0 Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.379167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerDied","Data":"c7efc3bf5ec0a3a745508cfbeedbf2a676ec3389b99cbb20a3ba37972d4920be"} Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.391969 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" podStartSLOduration=3.391948115 podStartE2EDuration="3.391948115s" podCreationTimestamp="2025-11-25 16:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:36.384089449 +0000 UTC m=+1075.505928998" watchObservedRunningTime="2025-11-25 16:16:36.391948115 +0000 UTC m=+1075.513787664" Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.392802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerStarted","Data":"80abc756fcd46070ea658aac16e6a33f0e270fa22a756655123278547e25555c"} Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.590069 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:36 crc kubenswrapper[4743]: I1125 16:16:36.666380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:16:36 crc kubenswrapper[4743]: W1125 16:16:36.897062 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cc321eb_76fb_487a_bc5a_84af0d390efa.slice/crio-0042bc0146cb55883cca305b6c791f2916469fb8acf76740157bc7fee3f8a48c WatchSource:0}: Error finding container 0042bc0146cb55883cca305b6c791f2916469fb8acf76740157bc7fee3f8a48c: Status 404 returned error can't find the container with id 0042bc0146cb55883cca305b6c791f2916469fb8acf76740157bc7fee3f8a48c Nov 25 16:16:36 crc kubenswrapper[4743]: W1125 16:16:36.900445 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf402dd15_78ae_4695_a91b_0cf339b16c76.slice/crio-d5a561bf4fef0d4f43d96dad6ca09b86882798dd4df3a86fd0a5b514405d16e5 WatchSource:0}: Error finding container d5a561bf4fef0d4f43d96dad6ca09b86882798dd4df3a86fd0a5b514405d16e5: Status 404 returned error can't find the container with id d5a561bf4fef0d4f43d96dad6ca09b86882798dd4df3a86fd0a5b514405d16e5 Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.370145 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.407568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" 
event={"ID":"f402dd15-78ae-4695-a91b-0cf339b16c76","Type":"ContainerStarted","Data":"d5a561bf4fef0d4f43d96dad6ca09b86882798dd4df3a86fd0a5b514405d16e5"} Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.410665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerStarted","Data":"9102f2b58bc41799d16dd2c669a1bd25acb4f006ca393a46bd727cc1682b1bae"} Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.429663 4743 generic.go:334] "Generic (PLEG): container finished" podID="bf8182fd-592f-4c32-8270-a574eacb5638" containerID="c53caca7d07a07f9d55d1062d2c96b6a0742a755f21faeb5ed134845cf98bfc3" exitCode=0 Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.429724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" event={"ID":"bf8182fd-592f-4c32-8270-a574eacb5638","Type":"ContainerDied","Data":"c53caca7d07a07f9d55d1062d2c96b6a0742a755f21faeb5ed134845cf98bfc3"} Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.429750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" event={"ID":"bf8182fd-592f-4c32-8270-a574eacb5638","Type":"ContainerDied","Data":"f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc"} Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.429760 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02b9478b3e5ba69b2ed581384bb3973ed540ec82a018af781513d9f8c15cabc" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.432073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerStarted","Data":"0042bc0146cb55883cca305b6c791f2916469fb8acf76740157bc7fee3f8a48c"} Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.519935 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621509 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4c8x\" (UniqueName: \"kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621557 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.621700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.626616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x" (OuterVolumeSpecName: "kube-api-access-s4c8x") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "kube-api-access-s4c8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.706458 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.719703 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.722971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config" (OuterVolumeSpecName: "config") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.723281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") pod \"bf8182fd-592f-4c32-8270-a574eacb5638\" (UID: \"bf8182fd-592f-4c32-8270-a574eacb5638\") " Nov 25 16:16:37 crc kubenswrapper[4743]: W1125 16:16:37.723628 4743 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf8182fd-592f-4c32-8270-a574eacb5638/volumes/kubernetes.io~configmap/config Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.723657 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config" (OuterVolumeSpecName: "config") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.723955 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.724002 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4c8x\" (UniqueName: \"kubernetes.io/projected/bf8182fd-592f-4c32-8270-a574eacb5638-kube-api-access-s4c8x\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.724013 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.724026 4743 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.726937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.741914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf8182fd-592f-4c32-8270-a574eacb5638" (UID: "bf8182fd-592f-4c32-8270-a574eacb5638"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.825493 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:37 crc kubenswrapper[4743]: I1125 16:16:37.825528 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8182fd-592f-4c32-8270-a574eacb5638-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.457912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c9cfd9b5-p7l4w" event={"ID":"ef127ba1-444d-4f1c-937b-965c7ce47d1a","Type":"ContainerStarted","Data":"5c7b9a7d8584fea07cc5e92879ff1b0d5b7f2d1cfabdc0eb66d14886a4ff1707"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.458394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c9cfd9b5-p7l4w" 
event={"ID":"ef127ba1-444d-4f1c-937b-965c7ce47d1a","Type":"ContainerStarted","Data":"999d18801a60d6898049b602c952207e45f4ae854452700c31cfdca1d2d85e0e"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.468027 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerStarted","Data":"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.494036 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c9cfd9b5-p7l4w" podStartSLOduration=2.688958564 podStartE2EDuration="5.494007813s" podCreationTimestamp="2025-11-25 16:16:33 +0000 UTC" firstStartedPulling="2025-11-25 16:16:34.596994931 +0000 UTC m=+1073.718834480" lastFinishedPulling="2025-11-25 16:16:37.40204418 +0000 UTC m=+1076.523883729" observedRunningTime="2025-11-25 16:16:38.481872062 +0000 UTC m=+1077.603711611" watchObservedRunningTime="2025-11-25 16:16:38.494007813 +0000 UTC m=+1077.615847362" Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.509970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerStarted","Data":"d1513f2957d6f7fa43224017686c56c664bd76898c0dc674dff7f057da23d2af"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.510075 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.518104 4743 generic.go:334] "Generic (PLEG): container finished" podID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerID="45f0c5c98959884ee5c801b099ed5bff8e3a3cf8055561f103641ff96be46cb5" exitCode=0 Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.518175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" 
event={"ID":"f402dd15-78ae-4695-a91b-0cf339b16c76","Type":"ContainerDied","Data":"45f0c5c98959884ee5c801b099ed5bff8e3a3cf8055561f103641ff96be46cb5"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.531312 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerStarted","Data":"e48e338415b25860bb1565afab057610e27feac04084af6f6914cf5f57b15007"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.539391 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-kq6fr" Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.539390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" event={"ID":"095f59f0-0093-4e6d-8aa3-0ddc0161b213","Type":"ContainerStarted","Data":"e9d135d57b9a6103ca182bcfd01ed2dfe4700e8a573cb9d7c82c2fe80a87219e"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.539487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" event={"ID":"095f59f0-0093-4e6d-8aa3-0ddc0161b213","Type":"ContainerStarted","Data":"44ffa90042083d32317f318b16d659c278858407ab7d56abcdb465c2e2e8a07c"} Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.543219 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.192223836 podStartE2EDuration="7.543205758s" podCreationTimestamp="2025-11-25 16:16:31 +0000 UTC" firstStartedPulling="2025-11-25 16:16:32.056478838 +0000 UTC m=+1071.178318387" lastFinishedPulling="2025-11-25 16:16:37.40746076 +0000 UTC m=+1076.529300309" observedRunningTime="2025-11-25 16:16:38.538975304 +0000 UTC m=+1077.660814863" watchObservedRunningTime="2025-11-25 16:16:38.543205758 +0000 UTC m=+1077.665045307" Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.717266 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.737158 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-kq6fr"] Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.743570 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 16:16:38 crc kubenswrapper[4743]: I1125 16:16:38.748652 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-74944f4b54-xg775" podStartSLOduration=2.918806752 podStartE2EDuration="5.748628558s" podCreationTimestamp="2025-11-25 16:16:33 +0000 UTC" firstStartedPulling="2025-11-25 16:16:34.576982573 +0000 UTC m=+1073.698822112" lastFinishedPulling="2025-11-25 16:16:37.406804369 +0000 UTC m=+1076.528643918" observedRunningTime="2025-11-25 16:16:38.649720622 +0000 UTC m=+1077.771560181" watchObservedRunningTime="2025-11-25 16:16:38.748628558 +0000 UTC m=+1077.870468107" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.173129 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.553803 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" event={"ID":"f402dd15-78ae-4695-a91b-0cf339b16c76","Type":"ContainerStarted","Data":"f78541f79cdc42aa719c3782039125f449e272a16e70e225e6a7b631a21b6619"} Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.554329 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.557410 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerStarted","Data":"bab5bf52c2c1310b35ae776a92d905d8f4f52cf940b16c40a81f03ee83cb5a87"} Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.557528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.557563 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api-log" containerID="cri-o://e48e338415b25860bb1565afab057610e27feac04084af6f6914cf5f57b15007" gracePeriod=30 Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.557603 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api" containerID="cri-o://bab5bf52c2c1310b35ae776a92d905d8f4f52cf940b16c40a81f03ee83cb5a87" gracePeriod=30 Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.559678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerStarted","Data":"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce"} Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.594929 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" podStartSLOduration=4.594910257 podStartE2EDuration="4.594910257s" podCreationTimestamp="2025-11-25 16:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:39.592574254 +0000 UTC m=+1078.714413813" watchObservedRunningTime="2025-11-25 16:16:39.594910257 +0000 UTC m=+1078.716749806" Nov 25 16:16:39 crc kubenswrapper[4743]: 
I1125 16:16:39.663373 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.805461742 podStartE2EDuration="4.663351086s" podCreationTimestamp="2025-11-25 16:16:35 +0000 UTC" firstStartedPulling="2025-11-25 16:16:36.903832978 +0000 UTC m=+1076.025672517" lastFinishedPulling="2025-11-25 16:16:37.761722312 +0000 UTC m=+1076.883561861" observedRunningTime="2025-11-25 16:16:39.657125211 +0000 UTC m=+1078.778964760" watchObservedRunningTime="2025-11-25 16:16:39.663351086 +0000 UTC m=+1078.785190645" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.714160 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.714142661 podStartE2EDuration="4.714142661s" podCreationTimestamp="2025-11-25 16:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:39.709224776 +0000 UTC m=+1078.831064325" watchObservedRunningTime="2025-11-25 16:16:39.714142661 +0000 UTC m=+1078.835982210" Nov 25 16:16:39 crc kubenswrapper[4743]: I1125 16:16:39.785261 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" path="/var/lib/kubelet/pods/bf8182fd-592f-4c32-8270-a574eacb5638/volumes" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.267954 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85dc5d687d-qkdzh"] Nov 25 16:16:40 crc kubenswrapper[4743]: E1125 16:16:40.268338 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="dnsmasq-dns" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.268355 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="dnsmasq-dns" Nov 25 16:16:40 crc kubenswrapper[4743]: E1125 16:16:40.268378 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="init" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.268384 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="init" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.268562 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8182fd-592f-4c32-8270-a574eacb5638" containerName="dnsmasq-dns" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.269542 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.271896 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.272029 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.293022 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85dc5d687d-qkdzh"] Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.419921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-combined-ca-bundle\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-public-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " 
pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgv2d\" (UniqueName: \"kubernetes.io/projected/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-kube-api-access-vgv2d\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420115 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data-custom\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-internal-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.420736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-logs\") pod \"barbican-api-85dc5d687d-qkdzh\" 
(UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.523208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgv2d\" (UniqueName: \"kubernetes.io/projected/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-kube-api-access-vgv2d\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.523818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data-custom\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.523908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.524009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-internal-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.524047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-logs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " 
pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.524070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-combined-ca-bundle\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.524119 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-public-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.531239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-logs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.536154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-public-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.536153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data-custom\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc 
kubenswrapper[4743]: I1125 16:16:40.540405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-internal-tls-certs\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.551416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-combined-ca-bundle\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.552076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-config-data\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.558178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgv2d\" (UniqueName: \"kubernetes.io/projected/f3984c1f-c5d2-4a6a-9058-4c272455dcd8-kube-api-access-vgv2d\") pod \"barbican-api-85dc5d687d-qkdzh\" (UID: \"f3984c1f-c5d2-4a6a-9058-4c272455dcd8\") " pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.591482 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.596479 4743 generic.go:334] "Generic (PLEG): container finished" podID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerID="bab5bf52c2c1310b35ae776a92d905d8f4f52cf940b16c40a81f03ee83cb5a87" exitCode=0 Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.596549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerDied","Data":"bab5bf52c2c1310b35ae776a92d905d8f4f52cf940b16c40a81f03ee83cb5a87"} Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.596601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerDied","Data":"e48e338415b25860bb1565afab057610e27feac04084af6f6914cf5f57b15007"} Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.596568 4743 generic.go:334] "Generic (PLEG): container finished" podID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerID="e48e338415b25860bb1565afab057610e27feac04084af6f6914cf5f57b15007" exitCode=143 Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.800152 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.923714 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940104 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: 
I1125 16:16:40.940463 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndklb\" (UniqueName: \"kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.940500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom\") pod \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\" (UID: \"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a\") " Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.941656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.942245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs" (OuterVolumeSpecName: "logs") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.950284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.960662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb" (OuterVolumeSpecName: "kube-api-access-ndklb") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "kube-api-access-ndklb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.960771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts" (OuterVolumeSpecName: "scripts") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:16:40 crc kubenswrapper[4743]: I1125 16:16:40.996653 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.022307 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data" (OuterVolumeSpecName: "config-data") pod "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" (UID: "84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.042984 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043024 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndklb\" (UniqueName: \"kubernetes.io/projected/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-kube-api-access-ndklb\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043041 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043053 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043064 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043075 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.043088 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a-logs\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.188630 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85dc5d687d-qkdzh"]
Nov 25 16:16:41 crc kubenswrapper[4743]: W1125 16:16:41.198521 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3984c1f_c5d2_4a6a_9058_4c272455dcd8.slice/crio-d2c435621b6fa262b882b156fb650ce4b4beb6f4996946a6dad2f84d60733785 WatchSource:0}: Error finding container d2c435621b6fa262b882b156fb650ce4b4beb6f4996946a6dad2f84d60733785: Status 404 returned error can't find the container with id d2c435621b6fa262b882b156fb650ce4b4beb6f4996946a6dad2f84d60733785
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.606062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85dc5d687d-qkdzh" event={"ID":"f3984c1f-c5d2-4a6a-9058-4c272455dcd8","Type":"ContainerStarted","Data":"174f0f1d77f42291bba66afbe75a26df575ba0eb325293b4ac4bd1d0192f0896"}
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.606341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85dc5d687d-qkdzh" event={"ID":"f3984c1f-c5d2-4a6a-9058-4c272455dcd8","Type":"ContainerStarted","Data":"d2c435621b6fa262b882b156fb650ce4b4beb6f4996946a6dad2f84d60733785"}
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.608694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a","Type":"ContainerDied","Data":"9102f2b58bc41799d16dd2c669a1bd25acb4f006ca393a46bd727cc1682b1bae"}
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.608750 4743 scope.go:117] "RemoveContainer" containerID="bab5bf52c2c1310b35ae776a92d905d8f4f52cf940b16c40a81f03ee83cb5a87"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.608747 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.664219 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.697999 4743 scope.go:117] "RemoveContainer" containerID="e48e338415b25860bb1565afab057610e27feac04084af6f6914cf5f57b15007"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.709785 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.735754 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Nov 25 16:16:41 crc kubenswrapper[4743]: E1125 16:16:41.736224 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api-log"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.736242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api-log"
Nov 25 16:16:41 crc kubenswrapper[4743]: E1125 16:16:41.736262 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.736269 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.736455 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api-log"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.736476 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" containerName="cinder-api"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.737513 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.742953 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.743188 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.743629 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.755644 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.793425 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a" path="/var/lib/kubelet/pods/84bd9eba-ffd7-4d4c-ba8c-c9c7efb7645a/volumes"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8e01616-0594-420e-9180-2c348780903a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864789 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e01616-0594-420e-9180-2c348780903a-logs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-scripts\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ppc\" (UniqueName: \"kubernetes.io/projected/f8e01616-0594-420e-9180-2c348780903a-kube-api-access-x8ppc\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.864992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ppc\" (UniqueName: \"kubernetes.io/projected/f8e01616-0594-420e-9180-2c348780903a-kube-api-access-x8ppc\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966780 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966874 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8e01616-0594-420e-9180-2c348780903a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e01616-0594-420e-9180-2c348780903a-logs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.966979 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.967011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-scripts\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.967132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8e01616-0594-420e-9180-2c348780903a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.967565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e01616-0594-420e-9180-2c348780903a-logs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.972554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.972585 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.973043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-scripts\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.973229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.974120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.974502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e01616-0594-420e-9180-2c348780903a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:41 crc kubenswrapper[4743]: I1125 16:16:41.990570 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ppc\" (UniqueName: \"kubernetes.io/projected/f8e01616-0594-420e-9180-2c348780903a-kube-api-access-x8ppc\") pod \"cinder-api-0\" (UID: \"f8e01616-0594-420e-9180-2c348780903a\") " pod="openstack/cinder-api-0"
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.067120 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.634559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85dc5d687d-qkdzh" event={"ID":"f3984c1f-c5d2-4a6a-9058-4c272455dcd8","Type":"ContainerStarted","Data":"e2a79c5a33f3d128d3ede769f36f0d73035c214836a716fd67899a6ef10af649"}
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.635397 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85dc5d687d-qkdzh"
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.635427 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85dc5d687d-qkdzh"
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.654222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 25 16:16:42 crc kubenswrapper[4743]: I1125 16:16:42.668171 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85dc5d687d-qkdzh" podStartSLOduration=2.668144086 podStartE2EDuration="2.668144086s" podCreationTimestamp="2025-11-25 16:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:42.656985506 +0000 UTC m=+1081.778825065" watchObservedRunningTime="2025-11-25 16:16:42.668144086 +0000 UTC m=+1081.789983635"
Nov 25 16:16:43 crc kubenswrapper[4743]: I1125 16:16:43.650882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8e01616-0594-420e-9180-2c348780903a","Type":"ContainerStarted","Data":"0db883aa819317722e304ddcc4842fc5945ffd200672117184b3923bd83de9f6"}
Nov 25 16:16:43 crc kubenswrapper[4743]: I1125 16:16:43.651235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8e01616-0594-420e-9180-2c348780903a","Type":"ContainerStarted","Data":"8abf39d75e31706a98045cfe8ef95e3b212ded06dfa76480b51d572c2ff17560"}
Nov 25 16:16:44 crc kubenswrapper[4743]: I1125 16:16:44.662079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8e01616-0594-420e-9180-2c348780903a","Type":"ContainerStarted","Data":"e25f1476a01bba8ee02a7102a4fabab6e0f4e85315dc5284efb51c9b7d74bb30"}
Nov 25 16:16:44 crc kubenswrapper[4743]: I1125 16:16:44.662258 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Nov 25 16:16:44 crc kubenswrapper[4743]: I1125 16:16:44.686327 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.686303139 podStartE2EDuration="3.686303139s" podCreationTimestamp="2025-11-25 16:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:44.678163584 +0000 UTC m=+1083.800003173" watchObservedRunningTime="2025-11-25 16:16:44.686303139 +0000 UTC m=+1083.808142698"
Nov 25 16:16:45 crc kubenswrapper[4743]: I1125 16:16:45.437407 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7775b654cd-srptg"
Nov 25 16:16:45 crc kubenswrapper[4743]: I1125 16:16:45.470711 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7775b654cd-srptg"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.040714 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.123403 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"]
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.123845 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-8r569" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="dnsmasq-dns" containerID="cri-o://afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58" gracePeriod=10
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.190793 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.229024 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 25 16:16:46 crc kubenswrapper[4743]: E1125 16:16:46.316010 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07384a4d_5f9c_4902_a4e8_4be4d7e4b8a0.slice/crio-conmon-afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58.scope\": RecentStats: unable to find data in memory cache]"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.582177 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b9d9d486d-xrstf"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.644331 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8r569"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.712170 4743 generic.go:334] "Generic (PLEG): container finished" podID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerID="afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58" exitCode=0
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.712239 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8r569"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.712324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8r569" event={"ID":"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0","Type":"ContainerDied","Data":"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"}
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.712373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8r569" event={"ID":"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0","Type":"ContainerDied","Data":"a47d569b8e5c477800ce1752e373edfd9af0026cc1355c507f82a3b2919923f9"}
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.712394 4743 scope.go:117] "RemoveContainer" containerID="afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.713081 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="cinder-scheduler" containerID="cri-o://21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e" gracePeriod=30
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.713166 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="probe" containerID="cri-o://5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce" gracePeriod=30
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.747495 4743 scope.go:117] "RemoveContainer" containerID="ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.778289 4743 scope.go:117] "RemoveContainer" containerID="afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"
Nov 25 16:16:46 crc kubenswrapper[4743]: E1125 16:16:46.779995 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58\": container with ID starting with afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58 not found: ID does not exist" containerID="afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.780074 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58"} err="failed to get container status \"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58\": rpc error: code = NotFound desc = could not find container \"afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58\": container with ID starting with afbe6624778d75a40282fbb87b84b05faa4cbcac0b563e94119c4f77bc0eaa58 not found: ID does not exist"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.780101 4743 scope.go:117] "RemoveContainer" containerID="ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c"
Nov 25 16:16:46 crc kubenswrapper[4743]: E1125 16:16:46.782208 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c\": container with ID starting with ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c not found: ID does not exist" containerID="ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.782229 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c"} err="failed to get container status \"ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c\": rpc error: code = NotFound desc = could not find container \"ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c\": container with ID starting with ad5bfaf49e6da92508b2c0d1ec7d239dd71259581eebcd7c62ac46d3f9dc182c not found: ID does not exist"
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785250 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhmjc\" (UniqueName: \"kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785350 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785433 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.785515 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb\") pod \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\" (UID: \"07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0\") "
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.798645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc" (OuterVolumeSpecName: "kube-api-access-lhmjc") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "kube-api-access-lhmjc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.871384 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config" (OuterVolumeSpecName: "config") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.888804 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhmjc\" (UniqueName: \"kubernetes.io/projected/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-kube-api-access-lhmjc\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.889099 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-config\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.917805 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.938426 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.941082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.946956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" (UID: "07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.990654 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.990686 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.990696 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:46 crc kubenswrapper[4743]: I1125 16:16:46.990704 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 25 16:16:47 crc kubenswrapper[4743]: I1125 16:16:47.082914 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"]
Nov 25 16:16:47 crc kubenswrapper[4743]: I1125 16:16:47.090056 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8r569"]
Nov 25 16:16:47 crc kubenswrapper[4743]: I1125 16:16:47.726935 4743 generic.go:334] "Generic (PLEG): container finished" podID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerID="5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce" exitCode=0
Nov 25 16:16:47 crc kubenswrapper[4743]: I1125 16:16:47.727035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerDied","Data":"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce"}
Nov 25 16:16:47 crc kubenswrapper[4743]: I1125 16:16:47.786823 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" path="/var/lib/kubelet/pods/07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0/volumes"
Nov 25 16:16:48 crc kubenswrapper[4743]: I1125 16:16:48.231055 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-848747fd7b-bljn8"
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.173719 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.295711 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cc5fc48dc-hkvc8"
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.363287 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"]
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.363528 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b9d9d486d-xrstf" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-api" containerID="cri-o://58187b5c9feb18d00ba0264824147f6af105b5a3d44b7de95c83e9560e5c5d44" gracePeriod=30
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.363809 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b9d9d486d-xrstf" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-httpd" containerID="cri-o://2f37558a404668862a517b9a4db9d7f9477c4728e9e8ee51cae5b26209ac8de4" gracePeriod=30
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.748325 4743 generic.go:334] "Generic (PLEG): container finished" podID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerID="2f37558a404668862a517b9a4db9d7f9477c4728e9e8ee51cae5b26209ac8de4" exitCode=0
Nov 25 16:16:49 crc kubenswrapper[4743]: I1125 16:16:49.748418 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerDied","Data":"2f37558a404668862a517b9a4db9d7f9477c4728e9e8ee51cae5b26209ac8de4"}
Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.005700 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.006427 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="init"
Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.006511 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="init"
Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.006622 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="dnsmasq-dns"
Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.006722 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="dnsmasq-dns"
Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.007061 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="dnsmasq-dns"
Nov 25 16:16:50
crc kubenswrapper[4743]: I1125 16:16:50.007916 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.011314 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.011522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lnr9p" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.011697 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.052009 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.077056 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.077108 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.147860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b65064de-e088-4c89-9767-db14019b6e44-openstack-config\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc 
kubenswrapper[4743]: I1125 16:16:50.148316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.148455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-openstack-config-secret\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.148515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6mn\" (UniqueName: \"kubernetes.io/projected/b65064de-e088-4c89-9767-db14019b6e44-kube-api-access-ds6mn\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.251633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.252027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-openstack-config-secret\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.252885 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ds6mn\" (UniqueName: \"kubernetes.io/projected/b65064de-e088-4c89-9767-db14019b6e44-kube-api-access-ds6mn\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.253043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b65064de-e088-4c89-9767-db14019b6e44-openstack-config\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.256904 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b65064de-e088-4c89-9767-db14019b6e44-openstack-config\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.257879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.266349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b65064de-e088-4c89-9767-db14019b6e44-openstack-config-secret\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.275527 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6mn\" (UniqueName: 
\"kubernetes.io/projected/b65064de-e088-4c89-9767-db14019b6e44-kube-api-access-ds6mn\") pod \"openstackclient\" (UID: \"b65064de-e088-4c89-9767-db14019b6e44\") " pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.344335 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.421347 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.562959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmm57\" (UniqueName: \"kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.563013 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.563082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.563133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc 
kubenswrapper[4743]: I1125 16:16:50.563257 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.563441 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data\") pod \"5cc321eb-76fb-487a-bc5a-84af0d390efa\" (UID: \"5cc321eb-76fb-487a-bc5a-84af0d390efa\") " Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.565962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.574647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts" (OuterVolumeSpecName: "scripts") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.577390 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.588659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57" (OuterVolumeSpecName: "kube-api-access-nmm57") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "kube-api-access-nmm57". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.666142 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmm57\" (UniqueName: \"kubernetes.io/projected/5cc321eb-76fb-487a-bc5a-84af0d390efa-kube-api-access-nmm57\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.666566 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.666576 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.666584 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cc321eb-76fb-487a-bc5a-84af0d390efa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.677721 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.707185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data" (OuterVolumeSpecName: "config-data") pod "5cc321eb-76fb-487a-bc5a-84af0d390efa" (UID: "5cc321eb-76fb-487a-bc5a-84af0d390efa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.760161 4743 generic.go:334] "Generic (PLEG): container finished" podID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerID="21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e" exitCode=0 Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.760205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerDied","Data":"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e"} Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.760236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cc321eb-76fb-487a-bc5a-84af0d390efa","Type":"ContainerDied","Data":"0042bc0146cb55883cca305b6c791f2916469fb8acf76740157bc7fee3f8a48c"} Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.760256 4743 scope.go:117] "RemoveContainer" containerID="5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.760291 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.769003 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.769038 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc321eb-76fb-487a-bc5a-84af0d390efa-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.787849 4743 scope.go:117] "RemoveContainer" containerID="21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.806797 4743 scope.go:117] "RemoveContainer" containerID="5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce" Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.807167 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce\": container with ID starting with 5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce not found: ID does not exist" containerID="5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.807197 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce"} err="failed to get container status \"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce\": rpc error: code = NotFound desc = could not find container \"5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce\": container with ID starting with 5d47d5f3333944896e693e1b77259b278a2003abcc86b6ba64d87ab230e708ce not found: ID does 
not exist" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.807216 4743 scope.go:117] "RemoveContainer" containerID="21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e" Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.807517 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e\": container with ID starting with 21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e not found: ID does not exist" containerID="21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.807545 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e"} err="failed to get container status \"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e\": rpc error: code = NotFound desc = could not find container \"21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e\": container with ID starting with 21c8691b2810c3da3a2e204493e9391ffdfd104ec8a9fe2590de23bc57233c8e not found: ID does not exist" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.896355 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.907613 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.914463 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.939580 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.939985 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="probe" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.940003 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="probe" Nov 25 16:16:50 crc kubenswrapper[4743]: E1125 16:16:50.940041 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="cinder-scheduler" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.940048 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="cinder-scheduler" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.940219 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="probe" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.940243 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" containerName="cinder-scheduler" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.941211 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.943198 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 16:16:50 crc kubenswrapper[4743]: I1125 16:16:50.952339 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.072949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.072991 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.073046 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.073073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.073126 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6jp\" (UniqueName: \"kubernetes.io/projected/0fd119f0-4e29-4050-baee-a0261c883787-kube-api-access-lc6jp\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.073186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd119f0-4e29-4050-baee-a0261c883787-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174604 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6jp\" (UniqueName: \"kubernetes.io/projected/0fd119f0-4e29-4050-baee-a0261c883787-kube-api-access-lc6jp\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd119f0-4e29-4050-baee-a0261c883787-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.174702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd119f0-4e29-4050-baee-a0261c883787-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.179861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.180257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " 
pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.182557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.183099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd119f0-4e29-4050-baee-a0261c883787-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.193107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6jp\" (UniqueName: \"kubernetes.io/projected/0fd119f0-4e29-4050-baee-a0261c883787-kube-api-access-lc6jp\") pod \"cinder-scheduler-0\" (UID: \"0fd119f0-4e29-4050-baee-a0261c883787\") " pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.271060 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.431745 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-8r569" podUID="07384a4d-5f9c-4902-a4e8-4be4d7e4b8a0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.743853 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 16:16:51 crc kubenswrapper[4743]: W1125 16:16:51.753395 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fd119f0_4e29_4050_baee_a0261c883787.slice/crio-8a5836420dde97b4d2a2d57e54c0ca2f30dabf3b1929f043f810f6240f2a9a46 WatchSource:0}: Error finding container 8a5836420dde97b4d2a2d57e54c0ca2f30dabf3b1929f043f810f6240f2a9a46: Status 404 returned error can't find the container with id 8a5836420dde97b4d2a2d57e54c0ca2f30dabf3b1929f043f810f6240f2a9a46 Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.773801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0fd119f0-4e29-4050-baee-a0261c883787","Type":"ContainerStarted","Data":"8a5836420dde97b4d2a2d57e54c0ca2f30dabf3b1929f043f810f6240f2a9a46"} Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.797851 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc321eb-76fb-487a-bc5a-84af0d390efa" path="/var/lib/kubelet/pods/5cc321eb-76fb-487a-bc5a-84af0d390efa/volumes" Nov 25 16:16:51 crc kubenswrapper[4743]: I1125 16:16:51.812547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b65064de-e088-4c89-9767-db14019b6e44","Type":"ContainerStarted","Data":"a1c8de0927d08290872040e1f3f77a5f35e67ffa2b5ca212be540b79b64bea37"} Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.412615 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.462314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85dc5d687d-qkdzh" Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.549983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.550256 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7775b654cd-srptg" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api-log" containerID="cri-o://98009a63f4539c6366807c5466c22f208c48db044be18b5f8559eddc1faf9d72" gracePeriod=30 Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.550421 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7775b654cd-srptg" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api" containerID="cri-o://e7b0c5288d719c79d924fb9a41486b1572517d2137bad15b507a0cdbe4f3a866" gracePeriod=30 Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.830372 4743 generic.go:334] "Generic (PLEG): container finished" podID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerID="98009a63f4539c6366807c5466c22f208c48db044be18b5f8559eddc1faf9d72" exitCode=143 Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.830448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerDied","Data":"98009a63f4539c6366807c5466c22f208c48db044be18b5f8559eddc1faf9d72"} Nov 25 16:16:52 crc kubenswrapper[4743]: I1125 16:16:52.832449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"0fd119f0-4e29-4050-baee-a0261c883787","Type":"ContainerStarted","Data":"7a7b5898bc300834b34b2c82939770813516a7cd6549eb483487994348997e60"} Nov 25 16:16:53 crc kubenswrapper[4743]: I1125 16:16:53.852797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0fd119f0-4e29-4050-baee-a0261c883787","Type":"ContainerStarted","Data":"0e552939c09ddb1257e132964c507a640ebc697f87b63f2f24034250ec99d687"} Nov 25 16:16:53 crc kubenswrapper[4743]: I1125 16:16:53.895026 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.895004429 podStartE2EDuration="3.895004429s" podCreationTimestamp="2025-11-25 16:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:53.889495096 +0000 UTC m=+1093.011334665" watchObservedRunningTime="2025-11-25 16:16:53.895004429 +0000 UTC m=+1093.016843978" Nov 25 16:16:54 crc kubenswrapper[4743]: I1125 16:16:54.645174 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.338047 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c64568bc5-svsgq"] Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.340057 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.345142 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.345239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.351262 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.380014 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c64568bc5-svsgq"] Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.479462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-etc-swift\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-combined-ca-bundle\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-config-data\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 
16:16:55.480324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gbz\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-kube-api-access-28gbz\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-internal-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-log-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-public-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.480560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-run-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 
crc kubenswrapper[4743]: I1125 16:16:55.583344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-etc-swift\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583454 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-combined-ca-bundle\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-config-data\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gbz\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-kube-api-access-28gbz\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583695 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-internal-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583745 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-log-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-public-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.583851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-run-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.585081 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-run-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.585213 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fd562da8-2d36-4517-8d73-237580575e98-log-httpd\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.594966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-config-data\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.596969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-internal-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.598221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-public-tls-certs\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.598370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-etc-swift\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.599282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd562da8-2d36-4517-8d73-237580575e98-combined-ca-bundle\") pod \"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.606038 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gbz\" (UniqueName: \"kubernetes.io/projected/fd562da8-2d36-4517-8d73-237580575e98-kube-api-access-28gbz\") pod 
\"swift-proxy-c64568bc5-svsgq\" (UID: \"fd562da8-2d36-4517-8d73-237580575e98\") " pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.664440 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.749104 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7775b654cd-srptg" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:51526->10.217.0.161:9311: read: connection reset by peer" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.749113 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7775b654cd-srptg" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:51528->10.217.0.161:9311: read: connection reset by peer" Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.892044 4743 generic.go:334] "Generic (PLEG): container finished" podID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerID="e7b0c5288d719c79d924fb9a41486b1572517d2137bad15b507a0cdbe4f3a866" exitCode=0 Nov 25 16:16:55 crc kubenswrapper[4743]: I1125 16:16:55.892457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerDied","Data":"e7b0c5288d719c79d924fb9a41486b1572517d2137bad15b507a0cdbe4f3a866"} Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.126504 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hbmzx"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.127846 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.136075 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.145270 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hbmzx"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201434 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs\") pod \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201503 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle\") pod \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201572 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data\") pod \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201604 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsfzg\" (UniqueName: \"kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg\") pod \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom\") pod \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\" (UID: \"42c67ca8-13cb-492f-b37a-6034a8b4f18b\") " Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.201987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcf8\" (UniqueName: \"kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.204921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs" (OuterVolumeSpecName: "logs") pod "42c67ca8-13cb-492f-b37a-6034a8b4f18b" (UID: "42c67ca8-13cb-492f-b37a-6034a8b4f18b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.211742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42c67ca8-13cb-492f-b37a-6034a8b4f18b" (UID: "42c67ca8-13cb-492f-b37a-6034a8b4f18b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.224386 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rz82t"] Nov 25 16:16:56 crc kubenswrapper[4743]: E1125 16:16:56.224851 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.224866 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api" Nov 25 16:16:56 crc kubenswrapper[4743]: E1125 16:16:56.224887 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api-log" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.224893 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api-log" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.225256 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.225281 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" containerName="barbican-api-log" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.228624 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.229791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg" (OuterVolumeSpecName: "kube-api-access-hsfzg") pod "42c67ca8-13cb-492f-b37a-6034a8b4f18b" (UID: "42c67ca8-13cb-492f-b37a-6034a8b4f18b"). InnerVolumeSpecName "kube-api-access-hsfzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.239199 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rz82t"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.255416 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c67ca8-13cb-492f-b37a-6034a8b4f18b" (UID: "42c67ca8-13cb-492f-b37a-6034a8b4f18b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.271677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.303963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304033 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcf8\" (UniqueName: \"kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrh59\" (UniqueName: \"kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59\") pod \"nova-cell0-db-create-rz82t\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " 
pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304238 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts\") pod \"nova-cell0-db-create-rz82t\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304351 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsfzg\" (UniqueName: \"kubernetes.io/projected/42c67ca8-13cb-492f-b37a-6034a8b4f18b-kube-api-access-hsfzg\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304387 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304402 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c67ca8-13cb-492f-b37a-6034a8b4f18b-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.304415 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.305550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.311665 4743 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data" (OuterVolumeSpecName: "config-data") pod "42c67ca8-13cb-492f-b37a-6034a8b4f18b" (UID: "42c67ca8-13cb-492f-b37a-6034a8b4f18b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.325949 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcf8\" (UniqueName: \"kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8\") pod \"nova-api-db-create-hbmzx\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.336150 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b457-account-create-pwttw"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.338058 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.351037 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.367993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hn4tp"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.369162 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.384919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hn4tp"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.404642 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b457-account-create-pwttw"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.405634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrh59\" (UniqueName: \"kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59\") pod \"nova-cell0-db-create-rz82t\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.405683 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w96ks\" (UniqueName: \"kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.405726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.405767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts\") pod \"nova-cell0-db-create-rz82t\" (UID: 
\"03a19749-2d18-460c-af7b-e3539fde228c\") " pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.405825 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c67ca8-13cb-492f-b37a-6034a8b4f18b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.406510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts\") pod \"nova-cell0-db-create-rz82t\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.453253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrh59\" (UniqueName: \"kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59\") pod \"nova-cell0-db-create-rz82t\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.455709 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.505635 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c64568bc5-svsgq"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.506822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w96ks\" (UniqueName: \"kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.506937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.507092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzsz9\" (UniqueName: \"kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.507207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.508195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.553802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w96ks\" (UniqueName: \"kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks\") pod \"nova-api-b457-account-create-pwttw\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.554216 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.567276 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c3b4-account-create-vhtfp"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.568400 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.580293 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.609074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.609146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dhw\" (UniqueName: \"kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.609228 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzsz9\" (UniqueName: \"kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.609262 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.609929 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.633093 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c3b4-account-create-vhtfp"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.651400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzsz9\" (UniqueName: \"kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9\") pod \"nova-cell1-db-create-hn4tp\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.701495 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.701533 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.713750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.713828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dhw\" (UniqueName: \"kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.715681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.753103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dhw\" (UniqueName: \"kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw\") pod \"nova-cell0-c3b4-account-create-vhtfp\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.866562 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.866917 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0cdb-account-create-4lr9x"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.869701 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.876328 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.878316 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0cdb-account-create-4lr9x"] Nov 25 16:16:56 crc kubenswrapper[4743]: I1125 16:16:56.983261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c64568bc5-svsgq" event={"ID":"fd562da8-2d36-4517-8d73-237580575e98","Type":"ContainerStarted","Data":"076b787c0b816b96b11b2de1444b115d7e0a001f1e981142df8de9fbe9ccf586"} Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.018126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7775b654cd-srptg" event={"ID":"42c67ca8-13cb-492f-b37a-6034a8b4f18b","Type":"ContainerDied","Data":"69158f234c1d97347c16f3e8b149fd903d10e3ee231885cbb4076396b54595bf"} Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.018176 4743 scope.go:117] "RemoveContainer" containerID="e7b0c5288d719c79d924fb9a41486b1572517d2137bad15b507a0cdbe4f3a866" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.018339 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7775b654cd-srptg" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.035470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.036185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp6lw\" (UniqueName: \"kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.062922 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.073233 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7775b654cd-srptg"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.090066 4743 scope.go:117] "RemoveContainer" containerID="98009a63f4539c6366807c5466c22f208c48db044be18b5f8559eddc1faf9d72" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.138463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.139368 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tp6lw\" (UniqueName: \"kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.140277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.166356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp6lw\" (UniqueName: \"kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw\") pod \"nova-cell1-0cdb-account-create-4lr9x\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.205696 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.225173 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.226160 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-central-agent" containerID="cri-o://53df142b9bba71f1886ea8ec33b4a900678c6db441cc4eb1f79b40dbdb53bc45" gracePeriod=30 Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.226300 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" containerID="cri-o://d1513f2957d6f7fa43224017686c56c664bd76898c0dc674dff7f057da23d2af" gracePeriod=30 Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.226347 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="sg-core" containerID="cri-o://80abc756fcd46070ea658aac16e6a33f0e270fa22a756655123278547e25555c" gracePeriod=30 Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.226375 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-notification-agent" containerID="cri-o://46838bad8c1361a4ae8274eea46d0829dfb5b46dcf6c017394bb3e7709383db3" gracePeriod=30 Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.252575 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hbmzx"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.260890 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rz82t"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 
16:16:57.332555 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": read tcp 10.217.0.2:48346->10.217.0.156:3000: read: connection reset by peer" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.507280 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hn4tp"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.526376 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b457-account-create-pwttw"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.726483 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c3b4-account-create-vhtfp"] Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.787052 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c67ca8-13cb-492f-b37a-6034a8b4f18b" path="/var/lib/kubelet/pods/42c67ca8-13cb-492f-b37a-6034a8b4f18b/volumes" Nov 25 16:16:57 crc kubenswrapper[4743]: I1125 16:16:57.796735 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0cdb-account-create-4lr9x"] Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.039671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" event={"ID":"9d16b76e-dc2a-456a-aaed-79f8338caaa9","Type":"ContainerStarted","Data":"0a3a7e3b2f5e3167a04e93cc31787cdfefa291db805cfbdef9af9d478c0407f0"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.045360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b457-account-create-pwttw" event={"ID":"814cd753-a1ab-45d6-9eb8-239b998f43ac","Type":"ContainerStarted","Data":"1e6b54a452860fe274ac0198874e65ab4f6d147a9c2f27082588f01afec3ec02"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.051273 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-db-create-hn4tp" event={"ID":"8a30cfb3-c372-4a9a-a444-573a26493643","Type":"ContainerStarted","Data":"e86dc18eb4690e1250cb2686d9d5228f2483466c6ddfd34e4896469d675fd337"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.059393 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" podStartSLOduration=2.059374906 podStartE2EDuration="2.059374906s" podCreationTimestamp="2025-11-25 16:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:58.057062053 +0000 UTC m=+1097.178901602" watchObservedRunningTime="2025-11-25 16:16:58.059374906 +0000 UTC m=+1097.181214455" Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073244 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerID="d1513f2957d6f7fa43224017686c56c664bd76898c0dc674dff7f057da23d2af" exitCode=0 Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073274 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerID="80abc756fcd46070ea658aac16e6a33f0e270fa22a756655123278547e25555c" exitCode=2 Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073282 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerID="53df142b9bba71f1886ea8ec33b4a900678c6db441cc4eb1f79b40dbdb53bc45" exitCode=0 Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerDied","Data":"d1513f2957d6f7fa43224017686c56c664bd76898c0dc674dff7f057da23d2af"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerDied","Data":"80abc756fcd46070ea658aac16e6a33f0e270fa22a756655123278547e25555c"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.073364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerDied","Data":"53df142b9bba71f1886ea8ec33b4a900678c6db441cc4eb1f79b40dbdb53bc45"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.081816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" event={"ID":"58dc0431-6633-41cc-9de5-9f18e85f82c1","Type":"ContainerStarted","Data":"ad449fef4a0c760846382c8012fcf727e439823bd536f31c593e551775da6a3f"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.085243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c64568bc5-svsgq" event={"ID":"fd562da8-2d36-4517-8d73-237580575e98","Type":"ContainerStarted","Data":"0fcff65028a41c4a3d576a62bdceaa6bd9f9937cf953cd824dd583a49240ca25"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.085291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c64568bc5-svsgq" event={"ID":"fd562da8-2d36-4517-8d73-237580575e98","Type":"ContainerStarted","Data":"b8bf71f2824c5f31f7bfdb94d69f1ab1b033e5994fa6a395ed64f1ff3ba19794"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.085503 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.085542 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.096892 4743 generic.go:334] "Generic (PLEG): container finished" podID="03a19749-2d18-460c-af7b-e3539fde228c" containerID="4f9401c46bd94810787e0917f1f25545dd02e23c768a8676777f641e4ec858f2" 
exitCode=0 Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.096980 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz82t" event={"ID":"03a19749-2d18-460c-af7b-e3539fde228c","Type":"ContainerDied","Data":"4f9401c46bd94810787e0917f1f25545dd02e23c768a8676777f641e4ec858f2"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.097006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz82t" event={"ID":"03a19749-2d18-460c-af7b-e3539fde228c","Type":"ContainerStarted","Data":"4a77052a5b79944ecbf387a4220b0229b1f9ba583a5bab2a2ffc090aa3ee45a7"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.104314 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" podStartSLOduration=2.104297656 podStartE2EDuration="2.104297656s" podCreationTimestamp="2025-11-25 16:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:58.097231774 +0000 UTC m=+1097.219071323" watchObservedRunningTime="2025-11-25 16:16:58.104297656 +0000 UTC m=+1097.226137205" Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.105965 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" containerID="f4839caa89b2d5d190088fa2bcc56bd460ad24455e645b4aa8df374ccc425705" exitCode=0 Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.106039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hbmzx" event={"ID":"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c","Type":"ContainerDied","Data":"f4839caa89b2d5d190088fa2bcc56bd460ad24455e645b4aa8df374ccc425705"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.106066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hbmzx" 
event={"ID":"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c","Type":"ContainerStarted","Data":"dda7e24c5e2d3ac847727bc23880ac02a4403e69c7abbf4bd65d72471e83f795"} Nov 25 16:16:58 crc kubenswrapper[4743]: I1125 16:16:58.128341 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c64568bc5-svsgq" podStartSLOduration=3.12832716 podStartE2EDuration="3.12832716s" podCreationTimestamp="2025-11-25 16:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:16:58.126806983 +0000 UTC m=+1097.248646552" watchObservedRunningTime="2025-11-25 16:16:58.12832716 +0000 UTC m=+1097.250166709" Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.120392 4743 generic.go:334] "Generic (PLEG): container finished" podID="9d16b76e-dc2a-456a-aaed-79f8338caaa9" containerID="52a815fadde519998c3a62340342da1c85661bb301f4810850a3925e08989d77" exitCode=0 Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.120698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" event={"ID":"9d16b76e-dc2a-456a-aaed-79f8338caaa9","Type":"ContainerDied","Data":"52a815fadde519998c3a62340342da1c85661bb301f4810850a3925e08989d77"} Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.122648 4743 generic.go:334] "Generic (PLEG): container finished" podID="58dc0431-6633-41cc-9de5-9f18e85f82c1" containerID="ebf27c6a18f7a9320091c1120b1d5b3676bc26f10881742dfddd3738b3f818a4" exitCode=0 Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.122692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" event={"ID":"58dc0431-6633-41cc-9de5-9f18e85f82c1","Type":"ContainerDied","Data":"ebf27c6a18f7a9320091c1120b1d5b3676bc26f10881742dfddd3738b3f818a4"} Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.124678 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="814cd753-a1ab-45d6-9eb8-239b998f43ac" containerID="c95a6e8287be81a1fcbfcea5065b5414aca80bba0026c5f9b4fb904c2818834d" exitCode=0 Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.124760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b457-account-create-pwttw" event={"ID":"814cd753-a1ab-45d6-9eb8-239b998f43ac","Type":"ContainerDied","Data":"c95a6e8287be81a1fcbfcea5065b5414aca80bba0026c5f9b4fb904c2818834d"} Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.126959 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a30cfb3-c372-4a9a-a444-573a26493643" containerID="69e370f1259b514c151bb504c5ed40e2e0b3dea122ce69771797f09b34dbcde3" exitCode=0 Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.126999 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hn4tp" event={"ID":"8a30cfb3-c372-4a9a-a444-573a26493643","Type":"ContainerDied","Data":"69e370f1259b514c151bb504c5ed40e2e0b3dea122ce69771797f09b34dbcde3"} Nov 25 16:16:59 crc kubenswrapper[4743]: I1125 16:16:59.174245 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f797f6cb-zd4ck" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 25 16:17:01 crc kubenswrapper[4743]: I1125 16:17:01.147082 4743 generic.go:334] "Generic (PLEG): container finished" podID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerID="58187b5c9feb18d00ba0264824147f6af105b5a3d44b7de95c83e9560e5c5d44" exitCode=0 Nov 25 16:17:01 crc kubenswrapper[4743]: I1125 16:17:01.147243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerDied","Data":"58187b5c9feb18d00ba0264824147f6af105b5a3d44b7de95c83e9560e5c5d44"} Nov 25 16:17:01 crc 
kubenswrapper[4743]: I1125 16:17:01.483411 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 16:17:01 crc kubenswrapper[4743]: I1125 16:17:01.609845 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": dial tcp 10.217.0.156:3000: connect: connection refused" Nov 25 16:17:02 crc kubenswrapper[4743]: I1125 16:17:02.166714 4743 generic.go:334] "Generic (PLEG): container finished" podID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerID="46838bad8c1361a4ae8274eea46d0829dfb5b46dcf6c017394bb3e7709383db3" exitCode=0 Nov 25 16:17:02 crc kubenswrapper[4743]: I1125 16:17:02.166743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerDied","Data":"46838bad8c1361a4ae8274eea46d0829dfb5b46dcf6c017394bb3e7709383db3"} Nov 25 16:17:03 crc kubenswrapper[4743]: I1125 16:17:03.176058 4743 generic.go:334] "Generic (PLEG): container finished" podID="65514eee-0e20-40f2-b381-21311ae5e899" containerID="7f692b888d980122a2a3b6fba9b74f9dbc88be7dfc5a20925b9cd7fd1dea23a9" exitCode=137 Nov 25 16:17:03 crc kubenswrapper[4743]: I1125 16:17:03.176136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerDied","Data":"7f692b888d980122a2a3b6fba9b74f9dbc88be7dfc5a20925b9cd7fd1dea23a9"} Nov 25 16:17:03 crc kubenswrapper[4743]: I1125 16:17:03.992732 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:17:03 crc kubenswrapper[4743]: I1125 16:17:03.993952 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6dd8557654-lgr92" Nov 25 16:17:04 crc 
kubenswrapper[4743]: I1125 16:17:04.207808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" event={"ID":"58dc0431-6633-41cc-9de5-9f18e85f82c1","Type":"ContainerDied","Data":"ad449fef4a0c760846382c8012fcf727e439823bd536f31c593e551775da6a3f"} Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.207848 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad449fef4a0c760846382c8012fcf727e439823bd536f31c593e551775da6a3f" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.212606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b457-account-create-pwttw" event={"ID":"814cd753-a1ab-45d6-9eb8-239b998f43ac","Type":"ContainerDied","Data":"1e6b54a452860fe274ac0198874e65ab4f6d147a9c2f27082588f01afec3ec02"} Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.212867 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e6b54a452860fe274ac0198874e65ab4f6d147a9c2f27082588f01afec3ec02" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.327600 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.437893 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.446937 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.453482 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.485061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.492920 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.503563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp6lw\" (UniqueName: \"kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw\") pod \"58dc0431-6633-41cc-9de5-9f18e85f82c1\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.503942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts\") pod \"58dc0431-6633-41cc-9de5-9f18e85f82c1\" (UID: \"58dc0431-6633-41cc-9de5-9f18e85f82c1\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.504719 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58dc0431-6633-41cc-9de5-9f18e85f82c1" (UID: "58dc0431-6633-41cc-9de5-9f18e85f82c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.519707 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw" (OuterVolumeSpecName: "kube-api-access-tp6lw") pod "58dc0431-6633-41cc-9de5-9f18e85f82c1" (UID: "58dc0431-6633-41cc-9de5-9f18e85f82c1"). InnerVolumeSpecName "kube-api-access-tp6lw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.593036 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606731 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w96ks\" (UniqueName: \"kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks\") pod \"814cd753-a1ab-45d6-9eb8-239b998f43ac\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606780 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts\") pod \"8a30cfb3-c372-4a9a-a444-573a26493643\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts\") pod \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjcf8\" (UniqueName: \"kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8\") pod \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts\") pod 
\"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\" (UID: \"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzsz9\" (UniqueName: \"kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9\") pod \"8a30cfb3-c372-4a9a-a444-573a26493643\" (UID: \"8a30cfb3-c372-4a9a-a444-573a26493643\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrh59\" (UniqueName: \"kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59\") pod \"03a19749-2d18-460c-af7b-e3539fde228c\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.606993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts\") pod \"814cd753-a1ab-45d6-9eb8-239b998f43ac\" (UID: \"814cd753-a1ab-45d6-9eb8-239b998f43ac\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.607025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts\") pod \"03a19749-2d18-460c-af7b-e3539fde228c\" (UID: \"03a19749-2d18-460c-af7b-e3539fde228c\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.607114 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5dhw\" (UniqueName: \"kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw\") pod \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\" (UID: \"9d16b76e-dc2a-456a-aaed-79f8338caaa9\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.607478 4743 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58dc0431-6633-41cc-9de5-9f18e85f82c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.607489 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp6lw\" (UniqueName: \"kubernetes.io/projected/58dc0431-6633-41cc-9de5-9f18e85f82c1-kube-api-access-tp6lw\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.607520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a30cfb3-c372-4a9a-a444-573a26493643" (UID: "8a30cfb3-c372-4a9a-a444-573a26493643"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.608460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "814cd753-a1ab-45d6-9eb8-239b998f43ac" (UID: "814cd753-a1ab-45d6-9eb8-239b998f43ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.609144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" (UID: "8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.609219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d16b76e-dc2a-456a-aaed-79f8338caaa9" (UID: "9d16b76e-dc2a-456a-aaed-79f8338caaa9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.610894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks" (OuterVolumeSpecName: "kube-api-access-w96ks") pod "814cd753-a1ab-45d6-9eb8-239b998f43ac" (UID: "814cd753-a1ab-45d6-9eb8-239b998f43ac"). InnerVolumeSpecName "kube-api-access-w96ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.611690 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9" (OuterVolumeSpecName: "kube-api-access-lzsz9") pod "8a30cfb3-c372-4a9a-a444-573a26493643" (UID: "8a30cfb3-c372-4a9a-a444-573a26493643"). InnerVolumeSpecName "kube-api-access-lzsz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.613457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw" (OuterVolumeSpecName: "kube-api-access-l5dhw") pod "9d16b76e-dc2a-456a-aaed-79f8338caaa9" (UID: "9d16b76e-dc2a-456a-aaed-79f8338caaa9"). InnerVolumeSpecName "kube-api-access-l5dhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.613627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03a19749-2d18-460c-af7b-e3539fde228c" (UID: "03a19749-2d18-460c-af7b-e3539fde228c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.615894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8" (OuterVolumeSpecName: "kube-api-access-sjcf8") pod "8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" (UID: "8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c"). InnerVolumeSpecName "kube-api-access-sjcf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.616182 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59" (OuterVolumeSpecName: "kube-api-access-xrh59") pod "03a19749-2d18-460c-af7b-e3539fde228c" (UID: "03a19749-2d18-460c-af7b-e3539fde228c"). InnerVolumeSpecName "kube-api-access-xrh59". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.629745 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709119 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709383 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709438 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7527\" (UniqueName: 
\"kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.709470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data\") pod \"65514eee-0e20-40f2-b381-21311ae5e899\" (UID: \"65514eee-0e20-40f2-b381-21311ae5e899\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710216 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814cd753-a1ab-45d6-9eb8-239b998f43ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710228 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrh59\" (UniqueName: \"kubernetes.io/projected/03a19749-2d18-460c-af7b-e3539fde228c-kube-api-access-xrh59\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710239 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a19749-2d18-460c-af7b-e3539fde228c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710249 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5dhw\" (UniqueName: \"kubernetes.io/projected/9d16b76e-dc2a-456a-aaed-79f8338caaa9-kube-api-access-l5dhw\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710258 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w96ks\" (UniqueName: \"kubernetes.io/projected/814cd753-a1ab-45d6-9eb8-239b998f43ac-kube-api-access-w96ks\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710266 4743 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a30cfb3-c372-4a9a-a444-573a26493643-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710276 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d16b76e-dc2a-456a-aaed-79f8338caaa9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710284 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjcf8\" (UniqueName: \"kubernetes.io/projected/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-kube-api-access-sjcf8\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710292 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710300 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzsz9\" (UniqueName: \"kubernetes.io/projected/8a30cfb3-c372-4a9a-a444-573a26493643-kube-api-access-lzsz9\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.710422 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs" (OuterVolumeSpecName: "logs") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.713789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.717748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527" (OuterVolumeSpecName: "kube-api-access-x7527") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "kube-api-access-x7527". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.743872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts" (OuterVolumeSpecName: "scripts") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.760494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.768113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data" (OuterVolumeSpecName: "config-data") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.771072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "65514eee-0e20-40f2-b381-21311ae5e899" (UID: "65514eee-0e20-40f2-b381-21311ae5e899"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812174 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc 
kubenswrapper[4743]: I1125 16:17:04.812256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92trk\" (UniqueName: \"kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data\") pod \"ae733a9b-cace-4e47-8c89-0b1adf03600a\" (UID: \"ae733a9b-cace-4e47-8c89-0b1adf03600a\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.812524 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813113 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813136 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813147 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813159 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65514eee-0e20-40f2-b381-21311ae5e899-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813169 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813180 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65514eee-0e20-40f2-b381-21311ae5e899-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813192 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7527\" (UniqueName: \"kubernetes.io/projected/65514eee-0e20-40f2-b381-21311ae5e899-kube-api-access-x7527\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813202 4743 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65514eee-0e20-40f2-b381-21311ae5e899-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813405 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.813631 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.816415 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk" (OuterVolumeSpecName: "kube-api-access-92trk") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "kube-api-access-92trk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.817750 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts" (OuterVolumeSpecName: "scripts") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.858041 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.902427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.915458 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle\") pod \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.915546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config\") pod \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.915625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs\") pod \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\" (UID: 
\"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.915675 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvst8\" (UniqueName: \"kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8\") pod \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.915706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config\") pod \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\" (UID: \"a08f6f91-e10d-432d-b8da-acd9f692e6bd\") " Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.916074 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.916086 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.916095 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92trk\" (UniqueName: \"kubernetes.io/projected/ae733a9b-cace-4e47-8c89-0b1adf03600a-kube-api-access-92trk\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.916104 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.916112 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ae733a9b-cace-4e47-8c89-0b1adf03600a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.934092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a08f6f91-e10d-432d-b8da-acd9f692e6bd" (UID: "a08f6f91-e10d-432d-b8da-acd9f692e6bd"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.934360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8" (OuterVolumeSpecName: "kube-api-access-jvst8") pod "a08f6f91-e10d-432d-b8da-acd9f692e6bd" (UID: "a08f6f91-e10d-432d-b8da-acd9f692e6bd"). InnerVolumeSpecName "kube-api-access-jvst8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.937753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data" (OuterVolumeSpecName: "config-data") pod "ae733a9b-cace-4e47-8c89-0b1adf03600a" (UID: "ae733a9b-cace-4e47-8c89-0b1adf03600a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.977380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config" (OuterVolumeSpecName: "config") pod "a08f6f91-e10d-432d-b8da-acd9f692e6bd" (UID: "a08f6f91-e10d-432d-b8da-acd9f692e6bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:04 crc kubenswrapper[4743]: I1125 16:17:04.989565 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a08f6f91-e10d-432d-b8da-acd9f692e6bd" (UID: "a08f6f91-e10d-432d-b8da-acd9f692e6bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.014779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a08f6f91-e10d-432d-b8da-acd9f692e6bd" (UID: "a08f6f91-e10d-432d-b8da-acd9f692e6bd"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017298 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae733a9b-cace-4e47-8c89-0b1adf03600a-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017328 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017341 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017351 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" 
Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017359 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvst8\" (UniqueName: \"kubernetes.io/projected/a08f6f91-e10d-432d-b8da-acd9f692e6bd-kube-api-access-jvst8\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.017370 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08f6f91-e10d-432d-b8da-acd9f692e6bd-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.223803 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hbmzx" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.223804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hbmzx" event={"ID":"8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c","Type":"ContainerDied","Data":"dda7e24c5e2d3ac847727bc23880ac02a4403e69c7abbf4bd65d72471e83f795"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.224045 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda7e24c5e2d3ac847727bc23880ac02a4403e69c7abbf4bd65d72471e83f795" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.225907 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hn4tp" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.225935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hn4tp" event={"ID":"8a30cfb3-c372-4a9a-a444-573a26493643","Type":"ContainerDied","Data":"e86dc18eb4690e1250cb2686d9d5228f2483466c6ddfd34e4896469d675fd337"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.226008 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e86dc18eb4690e1250cb2686d9d5228f2483466c6ddfd34e4896469d675fd337" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.228277 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.228305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3b4-account-create-vhtfp" event={"ID":"9d16b76e-dc2a-456a-aaed-79f8338caaa9","Type":"ContainerDied","Data":"0a3a7e3b2f5e3167a04e93cc31787cdfefa291db805cfbdef9af9d478c0407f0"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.228330 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3a7e3b2f5e3167a04e93cc31787cdfefa291db805cfbdef9af9d478c0407f0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.230366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f797f6cb-zd4ck" event={"ID":"65514eee-0e20-40f2-b381-21311ae5e899","Type":"ContainerDied","Data":"e61616598c151dfc5c9a723acc1846c4b18879116a2d6d423af5551cdab8944d"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.230414 4743 scope.go:117] "RemoveContainer" containerID="c7efc3bf5ec0a3a745508cfbeedbf2a676ec3389b99cbb20a3ba37972d4920be" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.230542 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66f797f6cb-zd4ck" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.238999 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b65064de-e088-4c89-9767-db14019b6e44","Type":"ContainerStarted","Data":"efd1fa58bd5b4ae88ceaea6e1ea73f5195ab442312ddf63bee050e0df31293d1"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.247185 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae733a9b-cace-4e47-8c89-0b1adf03600a","Type":"ContainerDied","Data":"2dc2ac55b579c4a3894e80de00264facc183707761e957ed6ab9464d8ca0e724"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.247237 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.249975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9d9d486d-xrstf" event={"ID":"a08f6f91-e10d-432d-b8da-acd9f692e6bd","Type":"ContainerDied","Data":"1d527939cef4dbbdfc34ce72b6db2eed1a8f9b58b8e86108b66c7b96caae434b"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.250085 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b9d9d486d-xrstf" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.255455 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.860355404 podStartE2EDuration="16.255432593s" podCreationTimestamp="2025-11-25 16:16:49 +0000 UTC" firstStartedPulling="2025-11-25 16:16:50.915494884 +0000 UTC m=+1090.037334433" lastFinishedPulling="2025-11-25 16:17:04.310572083 +0000 UTC m=+1103.432411622" observedRunningTime="2025-11-25 16:17:05.254120812 +0000 UTC m=+1104.375960381" watchObservedRunningTime="2025-11-25 16:17:05.255432593 +0000 UTC m=+1104.377272142" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.260650 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b457-account-create-pwttw" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.261326 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rz82t" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.262183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz82t" event={"ID":"03a19749-2d18-460c-af7b-e3539fde228c","Type":"ContainerDied","Data":"4a77052a5b79944ecbf387a4220b0229b1f9ba583a5bab2a2ffc090aa3ee45a7"} Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.262225 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a77052a5b79944ecbf387a4220b0229b1f9ba583a5bab2a2ffc090aa3ee45a7" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.262321 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0cdb-account-create-4lr9x" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.384154 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.411839 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66f797f6cb-zd4ck"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.451003 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.459870 4743 scope.go:117] "RemoveContainer" containerID="7f692b888d980122a2a3b6fba9b74f9dbc88be7dfc5a20925b9cd7fd1dea23a9" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.460495 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.473951 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.484796 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.485407 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.485519 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.485584 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="sg-core" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.485652 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="sg-core" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 
16:17:05.485705 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.485771 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.485836 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814cd753-a1ab-45d6-9eb8-239b998f43ac" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.485885 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="814cd753-a1ab-45d6-9eb8-239b998f43ac" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.485936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dc0431-6633-41cc-9de5-9f18e85f82c1" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.485992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dc0431-6633-41cc-9de5-9f18e85f82c1" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486041 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-api" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486093 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-api" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486148 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon-log" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486196 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon-log" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 
16:17:05.486247 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-central-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486297 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-central-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486349 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486399 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486457 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a19749-2d18-460c-af7b-e3539fde228c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486520 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a19749-2d18-460c-af7b-e3539fde228c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486609 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a30cfb3-c372-4a9a-a444-573a26493643" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486663 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a30cfb3-c372-4a9a-a444-573a26493643" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486723 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-notification-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486771 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" 
containerName="ceilometer-notification-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486824 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d16b76e-dc2a-456a-aaed-79f8338caaa9" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486881 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d16b76e-dc2a-456a-aaed-79f8338caaa9" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: E1125 16:17:05.486937 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.486988 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487220 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487289 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487344 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d16b76e-dc2a-456a-aaed-79f8338caaa9" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487398 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-notification-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487451 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="sg-core" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487501 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a30cfb3-c372-4a9a-a444-573a26493643" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487557 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="65514eee-0e20-40f2-b381-21311ae5e899" containerName="horizon-log" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487628 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dc0431-6633-41cc-9de5-9f18e85f82c1" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.487861 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="proxy-httpd" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.488021 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.488081 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" containerName="ceilometer-central-agent" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.488192 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="814cd753-a1ab-45d6-9eb8-239b998f43ac" containerName="mariadb-account-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.488253 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" containerName="neutron-api" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.488314 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a19749-2d18-460c-af7b-e3539fde228c" containerName="mariadb-database-create" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.490867 4743 scope.go:117] "RemoveContainer" containerID="d1513f2957d6f7fa43224017686c56c664bd76898c0dc674dff7f057da23d2af" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.492088 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.494922 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.495192 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.496292 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b9d9d486d-xrstf"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.504219 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.524864 4743 scope.go:117] "RemoveContainer" containerID="80abc756fcd46070ea658aac16e6a33f0e270fa22a756655123278547e25555c" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.547713 4743 scope.go:117] "RemoveContainer" containerID="46838bad8c1361a4ae8274eea46d0829dfb5b46dcf6c017394bb3e7709383db3" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.566885 4743 scope.go:117] "RemoveContainer" containerID="53df142b9bba71f1886ea8ec33b4a900678c6db441cc4eb1f79b40dbdb53bc45" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.584872 4743 scope.go:117] "RemoveContainer" containerID="2f37558a404668862a517b9a4db9d7f9477c4728e9e8ee51cae5b26209ac8de4" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.606289 4743 scope.go:117] "RemoveContainer" containerID="58187b5c9feb18d00ba0264824147f6af105b5a3d44b7de95c83e9560e5c5d44" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.639832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gxv\" (UniqueName: \"kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " 
pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.639897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.639917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.640140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.640249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.640392 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.640445 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.669509 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.670062 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c64568bc5-svsgq" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742796 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " 
pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742855 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gxv\" (UniqueName: \"kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.742924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.743472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.743507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.747500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.748032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.748513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.749371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.763494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gxv\" (UniqueName: \"kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv\") pod \"ceilometer-0\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " pod="openstack/ceilometer-0" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.790913 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65514eee-0e20-40f2-b381-21311ae5e899" path="/var/lib/kubelet/pods/65514eee-0e20-40f2-b381-21311ae5e899/volumes" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.791964 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a08f6f91-e10d-432d-b8da-acd9f692e6bd" path="/var/lib/kubelet/pods/a08f6f91-e10d-432d-b8da-acd9f692e6bd/volumes" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.792795 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae733a9b-cace-4e47-8c89-0b1adf03600a" path="/var/lib/kubelet/pods/ae733a9b-cace-4e47-8c89-0b1adf03600a/volumes" Nov 25 16:17:05 crc kubenswrapper[4743]: I1125 16:17:05.820054 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.252527 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.269386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerStarted","Data":"b8f7aeae6049264369f098a16d7ba5c4987412c4b5d100e40cd6dada30cee52c"} Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.871238 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9r9vw"] Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.872302 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.875285 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.875407 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5zkjx" Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.875941 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 16:17:06 crc kubenswrapper[4743]: I1125 16:17:06.887899 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9r9vw"] Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.064634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.070737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.071214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " 
pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.071531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngcr\" (UniqueName: \"kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.173843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.174465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.174748 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zngcr\" (UniqueName: \"kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.174869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: 
\"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.181252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.181711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.186108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.192869 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngcr\" (UniqueName: \"kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr\") pod \"nova-cell0-conductor-db-sync-9r9vw\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.203293 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.287462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerStarted","Data":"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a"} Nov 25 16:17:07 crc kubenswrapper[4743]: I1125 16:17:07.689835 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9r9vw"] Nov 25 16:17:07 crc kubenswrapper[4743]: W1125 16:17:07.691349 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d49d73_f8be_44ee_a3fc_37612fdb9440.slice/crio-ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3 WatchSource:0}: Error finding container ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3: Status 404 returned error can't find the container with id ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3 Nov 25 16:17:08 crc kubenswrapper[4743]: I1125 16:17:08.299194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" event={"ID":"84d49d73-f8be-44ee-a3fc-37612fdb9440","Type":"ContainerStarted","Data":"ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3"} Nov 25 16:17:08 crc kubenswrapper[4743]: I1125 16:17:08.304247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerStarted","Data":"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e"} Nov 25 16:17:09 crc kubenswrapper[4743]: I1125 16:17:09.315722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerStarted","Data":"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7"} Nov 25 16:17:14 crc kubenswrapper[4743]: I1125 16:17:14.424013 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.388362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" event={"ID":"84d49d73-f8be-44ee-a3fc-37612fdb9440","Type":"ContainerStarted","Data":"9ca9adb39873a76fd53f3d453014253693331a45377595054dd03d9f710977f8"} Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.391828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerStarted","Data":"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a"} Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.391932 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-central-agent" containerID="cri-o://cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a" gracePeriod=30 Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.392004 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.392005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-notification-agent" containerID="cri-o://9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e" gracePeriod=30 Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.392005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="proxy-httpd" containerID="cri-o://08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a" gracePeriod=30 Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.392342 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="sg-core" containerID="cri-o://5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7" gracePeriod=30 Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.413863 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" podStartSLOduration=2.008395101 podStartE2EDuration="10.413840998s" podCreationTimestamp="2025-11-25 16:17:06 +0000 UTC" firstStartedPulling="2025-11-25 16:17:07.696541642 +0000 UTC m=+1106.818381181" lastFinishedPulling="2025-11-25 16:17:16.101987529 +0000 UTC m=+1115.223827078" observedRunningTime="2025-11-25 16:17:16.405374403 +0000 UTC m=+1115.527213992" watchObservedRunningTime="2025-11-25 16:17:16.413840998 +0000 UTC m=+1115.535680557" Nov 25 16:17:16 crc kubenswrapper[4743]: I1125 16:17:16.435148 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.330878298 podStartE2EDuration="11.435120446s" podCreationTimestamp="2025-11-25 16:17:05 +0000 UTC" firstStartedPulling="2025-11-25 16:17:06.255251719 +0000 UTC m=+1105.377091268" lastFinishedPulling="2025-11-25 16:17:11.359493877 +0000 UTC m=+1110.481333416" observedRunningTime="2025-11-25 16:17:16.431378578 +0000 UTC m=+1115.553218147" watchObservedRunningTime="2025-11-25 16:17:16.435120446 +0000 UTC m=+1115.556960005" Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.207049 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.207680 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-log" containerID="cri-o://83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4" gracePeriod=30 Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.207786 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-httpd" containerID="cri-o://49f7563dc2a3fe63fc0b75f6c8e23719ce775916bc28f8c6f2f4ae6c5a107d03" gracePeriod=30 Nov 25 16:17:17 crc kubenswrapper[4743]: E1125 16:17:17.273484 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca4ab88_574a_473c_aab5_bf977973a9d8.slice/crio-83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4.scope\": RecentStats: unable to find data in memory cache]" Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.400488 4743 generic.go:334] "Generic (PLEG): container finished" podID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerID="83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4" exitCode=143 Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.400557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerDied","Data":"83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4"} Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.403559 4743 generic.go:334] "Generic (PLEG): container finished" podID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerID="5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7" exitCode=2 Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.403606 4743 generic.go:334] "Generic (PLEG): 
container finished" podID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerID="cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a" exitCode=0 Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.404406 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerDied","Data":"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7"} Nov 25 16:17:17 crc kubenswrapper[4743]: I1125 16:17:17.404457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerDied","Data":"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a"} Nov 25 16:17:18 crc kubenswrapper[4743]: I1125 16:17:18.422220 4743 generic.go:334] "Generic (PLEG): container finished" podID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerID="9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e" exitCode=0 Nov 25 16:17:18 crc kubenswrapper[4743]: I1125 16:17:18.422708 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerDied","Data":"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e"} Nov 25 16:17:18 crc kubenswrapper[4743]: I1125 16:17:18.431402 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:18 crc kubenswrapper[4743]: I1125 16:17:18.431650 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-log" containerID="cri-o://3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b" gracePeriod=30 Nov 25 16:17:18 crc kubenswrapper[4743]: I1125 16:17:18.431831 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-httpd" containerID="cri-o://0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8" gracePeriod=30 Nov 25 16:17:19 crc kubenswrapper[4743]: I1125 16:17:19.433276 4743 generic.go:334] "Generic (PLEG): container finished" podID="f117987a-97b8-4338-a8e2-0e298028faab" containerID="3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b" exitCode=143 Nov 25 16:17:19 crc kubenswrapper[4743]: I1125 16:17:19.433366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerDied","Data":"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b"} Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.077281 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.077341 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.448649 4743 generic.go:334] "Generic (PLEG): container finished" podID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerID="49f7563dc2a3fe63fc0b75f6c8e23719ce775916bc28f8c6f2f4ae6c5a107d03" exitCode=0 Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.448694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerDied","Data":"49f7563dc2a3fe63fc0b75f6c8e23719ce775916bc28f8c6f2f4ae6c5a107d03"} Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.852893 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.951696 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953060 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953102 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts\") pod 
\"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953327 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkfld\" (UniqueName: \"kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.953376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs\") pod \"bca4ab88-574a-473c-aab5-bf977973a9d8\" (UID: \"bca4ab88-574a-473c-aab5-bf977973a9d8\") " Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.954269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs" (OuterVolumeSpecName: "logs") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.954554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.965443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.966024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld" (OuterVolumeSpecName: "kube-api-access-lkfld") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "kube-api-access-lkfld". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.978791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts" (OuterVolumeSpecName: "scripts") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:20 crc kubenswrapper[4743]: I1125 16:17:20.997007 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.020190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data" (OuterVolumeSpecName: "config-data") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.050639 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bca4ab88-574a-473c-aab5-bf977973a9d8" (UID: "bca4ab88-574a-473c-aab5-bf977973a9d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055629 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055653 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055685 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055695 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: 
I1125 16:17:21.055705 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055714 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkfld\" (UniqueName: \"kubernetes.io/projected/bca4ab88-574a-473c-aab5-bf977973a9d8-kube-api-access-lkfld\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055723 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bca4ab88-574a-473c-aab5-bf977973a9d8-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.055731 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bca4ab88-574a-473c-aab5-bf977973a9d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.074450 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.157908 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.459398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bca4ab88-574a-473c-aab5-bf977973a9d8","Type":"ContainerDied","Data":"287af40f6e3be8c688110947e6748edaed0074cbbcdab2629245c61f3394194d"} Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.459458 4743 scope.go:117] "RemoveContainer" containerID="49f7563dc2a3fe63fc0b75f6c8e23719ce775916bc28f8c6f2f4ae6c5a107d03" Nov 25 
16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.459622 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.520031 4743 scope.go:117] "RemoveContainer" containerID="83acb6f7093f91075ad4ae7ba0181889d77738a2fed5ab3b50c357dfcd1d35e4" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.521167 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.534444 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.556558 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:21 crc kubenswrapper[4743]: E1125 16:17:21.557102 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-httpd" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.557130 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-httpd" Nov 25 16:17:21 crc kubenswrapper[4743]: E1125 16:17:21.557158 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-log" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.557167 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-log" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.557487 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-log" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.557516 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" containerName="glance-httpd" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.558570 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.560715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.560745 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.569750 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.668627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsz84\" (UniqueName: \"kubernetes.io/projected/791e2d3a-4b72-42dc-9df0-0a185817f347-kube-api-access-tsz84\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.668890 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.668996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-config-data\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.669139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.669217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-scripts\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.669319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.669455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-logs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.669550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.771960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-scripts\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.772129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.772292 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.772962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-logs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.773071 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: 
I1125 16:17:21.773161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsz84\" (UniqueName: \"kubernetes.io/projected/791e2d3a-4b72-42dc-9df0-0a185817f347-kube-api-access-tsz84\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.773218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.773290 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-config-data\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.773415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.773549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-logs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.777889 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/791e2d3a-4b72-42dc-9df0-0a185817f347-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.784168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-scripts\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.785504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.785714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.788745 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca4ab88-574a-473c-aab5-bf977973a9d8" path="/var/lib/kubelet/pods/bca4ab88-574a-473c-aab5-bf977973a9d8/volumes" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.789306 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2d3a-4b72-42dc-9df0-0a185817f347-config-data\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " 
pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.790451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsz84\" (UniqueName: \"kubernetes.io/projected/791e2d3a-4b72-42dc-9df0-0a185817f347-kube-api-access-tsz84\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.805976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"791e2d3a-4b72-42dc-9df0-0a185817f347\") " pod="openstack/glance-default-external-api-0" Nov 25 16:17:21 crc kubenswrapper[4743]: I1125 16:17:21.878058 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.100912 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgwhg\" (UniqueName: \"kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.182989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.183024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.184558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts\") pod \"f117987a-97b8-4338-a8e2-0e298028faab\" (UID: \"f117987a-97b8-4338-a8e2-0e298028faab\") " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.186491 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg" (OuterVolumeSpecName: "kube-api-access-mgwhg") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "kube-api-access-mgwhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.186586 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.187130 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs" (OuterVolumeSpecName: "logs") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.189836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.192046 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts" (OuterVolumeSpecName: "scripts") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.219065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.237358 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.239562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data" (OuterVolumeSpecName: "config-data") pod "f117987a-97b8-4338-a8e2-0e298028faab" (UID: "f117987a-97b8-4338-a8e2-0e298028faab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287751 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgwhg\" (UniqueName: \"kubernetes.io/projected/f117987a-97b8-4338-a8e2-0e298028faab-kube-api-access-mgwhg\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287790 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287830 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287841 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287854 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287865 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f117987a-97b8-4338-a8e2-0e298028faab-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287876 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.287886 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f117987a-97b8-4338-a8e2-0e298028faab-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.313779 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.388919 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.408647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.488072 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"791e2d3a-4b72-42dc-9df0-0a185817f347","Type":"ContainerStarted","Data":"e26f1883cb49cb183d306a8a52c11bfe33855885466572ddee3e3203f5cab52c"} Nov 25 
16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.492412 4743 generic.go:334] "Generic (PLEG): container finished" podID="f117987a-97b8-4338-a8e2-0e298028faab" containerID="0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8" exitCode=0 Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.492447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerDied","Data":"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8"} Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.492469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f117987a-97b8-4338-a8e2-0e298028faab","Type":"ContainerDied","Data":"35ded1038fd75dbddd18e74dcd9113dbb0a6d981b65bb13e10a6f868b6d5d48c"} Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.492485 4743 scope.go:117] "RemoveContainer" containerID="0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.492498 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.578759 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.587114 4743 scope.go:117] "RemoveContainer" containerID="3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.592876 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.601760 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:22 crc kubenswrapper[4743]: E1125 16:17:22.602263 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-log" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.602280 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-log" Nov 25 16:17:22 crc kubenswrapper[4743]: E1125 16:17:22.602295 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-httpd" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.602303 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-httpd" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.602507 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-log" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.602526 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f117987a-97b8-4338-a8e2-0e298028faab" containerName="glance-httpd" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.604323 4743 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.608929 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.609185 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.612285 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.619713 4743 scope.go:117] "RemoveContainer" containerID="0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8" Nov 25 16:17:22 crc kubenswrapper[4743]: E1125 16:17:22.627962 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8\": container with ID starting with 0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8 not found: ID does not exist" containerID="0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.627999 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8"} err="failed to get container status \"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8\": rpc error: code = NotFound desc = could not find container \"0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8\": container with ID starting with 0b4b31082053d4d89977b5633c59d12364ab42641c5d6ee0f0e787c2d83cabc8 not found: ID does not exist" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.628025 4743 scope.go:117] "RemoveContainer" 
containerID="3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b" Nov 25 16:17:22 crc kubenswrapper[4743]: E1125 16:17:22.628410 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b\": container with ID starting with 3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b not found: ID does not exist" containerID="3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.628431 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b"} err="failed to get container status \"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b\": rpc error: code = NotFound desc = could not find container \"3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b\": container with ID starting with 3f32e61e0bad49765dd62176b4d938c99b52e3fe309342ba5c89cac80eb5c22b not found: ID does not exist" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.693498 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.694004 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc 
kubenswrapper[4743]: I1125 16:17:22.694035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.694067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.694097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.694155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.694187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7bj\" (UniqueName: \"kubernetes.io/projected/07d575a4-6889-4bf6-ad82-4c7e756607d2-kube-api-access-8k7bj\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 
crc kubenswrapper[4743]: I1125 16:17:22.694240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.796351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.796483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.796501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.796556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.796695 4743 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07d575a4-6889-4bf6-ad82-4c7e756607d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8k7bj\" (UniqueName: \"kubernetes.io/projected/07d575a4-6889-4bf6-ad82-4c7e756607d2-kube-api-access-8k7bj\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.797760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.802178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.802906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.819538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.820058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07d575a4-6889-4bf6-ad82-4c7e756607d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.822367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7bj\" (UniqueName: \"kubernetes.io/projected/07d575a4-6889-4bf6-ad82-4c7e756607d2-kube-api-access-8k7bj\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.833747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"07d575a4-6889-4bf6-ad82-4c7e756607d2\") " pod="openstack/glance-default-internal-api-0" Nov 25 16:17:22 crc kubenswrapper[4743]: I1125 16:17:22.933019 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:23 crc kubenswrapper[4743]: I1125 16:17:23.462564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 16:17:23 crc kubenswrapper[4743]: W1125 16:17:23.472873 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07d575a4_6889_4bf6_ad82_4c7e756607d2.slice/crio-462d40bc055bbfb0ee65728a0843d4cccfbacf3b551dfee4f8a2259f00c59308 WatchSource:0}: Error finding container 462d40bc055bbfb0ee65728a0843d4cccfbacf3b551dfee4f8a2259f00c59308: Status 404 returned error can't find the container with id 462d40bc055bbfb0ee65728a0843d4cccfbacf3b551dfee4f8a2259f00c59308 Nov 25 16:17:23 crc kubenswrapper[4743]: I1125 16:17:23.506754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"791e2d3a-4b72-42dc-9df0-0a185817f347","Type":"ContainerStarted","Data":"3bcc895170bbb2fc5069a50b1abcc1913e33be27ed1654ba15d35ce44d8874a4"} Nov 25 16:17:23 crc kubenswrapper[4743]: I1125 16:17:23.512083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"07d575a4-6889-4bf6-ad82-4c7e756607d2","Type":"ContainerStarted","Data":"462d40bc055bbfb0ee65728a0843d4cccfbacf3b551dfee4f8a2259f00c59308"} Nov 25 16:17:23 crc kubenswrapper[4743]: I1125 16:17:23.790114 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f117987a-97b8-4338-a8e2-0e298028faab" path="/var/lib/kubelet/pods/f117987a-97b8-4338-a8e2-0e298028faab/volumes" Nov 25 16:17:24 crc kubenswrapper[4743]: I1125 16:17:24.521995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"791e2d3a-4b72-42dc-9df0-0a185817f347","Type":"ContainerStarted","Data":"67a6bce17154eda1325454a891125f4aea20360b4a593dda779184038d812c41"} Nov 25 16:17:24 crc 
kubenswrapper[4743]: I1125 16:17:24.523998 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"07d575a4-6889-4bf6-ad82-4c7e756607d2","Type":"ContainerStarted","Data":"04f830e77068d93a900c522cd12ba874ed5b540cd23668403981939bdd8ab699"} Nov 25 16:17:24 crc kubenswrapper[4743]: I1125 16:17:24.524047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"07d575a4-6889-4bf6-ad82-4c7e756607d2","Type":"ContainerStarted","Data":"527d65c5afe503fe880bbda36930922435301ed49ae01c7d47b8cac19478d42e"} Nov 25 16:17:24 crc kubenswrapper[4743]: I1125 16:17:24.544993 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.544975514 podStartE2EDuration="3.544975514s" podCreationTimestamp="2025-11-25 16:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:24.540626108 +0000 UTC m=+1123.662465677" watchObservedRunningTime="2025-11-25 16:17:24.544975514 +0000 UTC m=+1123.666815063" Nov 25 16:17:24 crc kubenswrapper[4743]: I1125 16:17:24.572802 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.572785067 podStartE2EDuration="2.572785067s" podCreationTimestamp="2025-11-25 16:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:24.56585149 +0000 UTC m=+1123.687691059" watchObservedRunningTime="2025-11-25 16:17:24.572785067 +0000 UTC m=+1123.694624616" Nov 25 16:17:26 crc kubenswrapper[4743]: I1125 16:17:26.542930 4743 generic.go:334] "Generic (PLEG): container finished" podID="84d49d73-f8be-44ee-a3fc-37612fdb9440" containerID="9ca9adb39873a76fd53f3d453014253693331a45377595054dd03d9f710977f8" 
exitCode=0 Nov 25 16:17:26 crc kubenswrapper[4743]: I1125 16:17:26.543003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" event={"ID":"84d49d73-f8be-44ee-a3fc-37612fdb9440","Type":"ContainerDied","Data":"9ca9adb39873a76fd53f3d453014253693331a45377595054dd03d9f710977f8"} Nov 25 16:17:27 crc kubenswrapper[4743]: I1125 16:17:27.935635 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.100529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data\") pod \"84d49d73-f8be-44ee-a3fc-37612fdb9440\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.100618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts\") pod \"84d49d73-f8be-44ee-a3fc-37612fdb9440\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.100706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle\") pod \"84d49d73-f8be-44ee-a3fc-37612fdb9440\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.100806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zngcr\" (UniqueName: \"kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr\") pod \"84d49d73-f8be-44ee-a3fc-37612fdb9440\" (UID: \"84d49d73-f8be-44ee-a3fc-37612fdb9440\") " Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.107618 
4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr" (OuterVolumeSpecName: "kube-api-access-zngcr") pod "84d49d73-f8be-44ee-a3fc-37612fdb9440" (UID: "84d49d73-f8be-44ee-a3fc-37612fdb9440"). InnerVolumeSpecName "kube-api-access-zngcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.108415 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts" (OuterVolumeSpecName: "scripts") pod "84d49d73-f8be-44ee-a3fc-37612fdb9440" (UID: "84d49d73-f8be-44ee-a3fc-37612fdb9440"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.133906 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data" (OuterVolumeSpecName: "config-data") pod "84d49d73-f8be-44ee-a3fc-37612fdb9440" (UID: "84d49d73-f8be-44ee-a3fc-37612fdb9440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.135976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84d49d73-f8be-44ee-a3fc-37612fdb9440" (UID: "84d49d73-f8be-44ee-a3fc-37612fdb9440"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.203153 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zngcr\" (UniqueName: \"kubernetes.io/projected/84d49d73-f8be-44ee-a3fc-37612fdb9440-kube-api-access-zngcr\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.203188 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.203200 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.203211 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84d49d73-f8be-44ee-a3fc-37612fdb9440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.560965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" event={"ID":"84d49d73-f8be-44ee-a3fc-37612fdb9440","Type":"ContainerDied","Data":"ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3"} Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.561025 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac06c25db6ad8755c4af4790f22a2ed11c1202590b1f58bc915f2e66094bf7d3" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.561067 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9r9vw" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.660032 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 16:17:28 crc kubenswrapper[4743]: E1125 16:17:28.660514 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d49d73-f8be-44ee-a3fc-37612fdb9440" containerName="nova-cell0-conductor-db-sync" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.660537 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d49d73-f8be-44ee-a3fc-37612fdb9440" containerName="nova-cell0-conductor-db-sync" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.660818 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d49d73-f8be-44ee-a3fc-37612fdb9440" containerName="nova-cell0-conductor-db-sync" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.661655 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.663518 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5zkjx" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.665912 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.670843 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.814384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97cf\" (UniqueName: \"kubernetes.io/projected/da59725d-9914-40d1-b70b-57df96de1db2-kube-api-access-b97cf\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc 
kubenswrapper[4743]: I1125 16:17:28.814485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.814550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.916148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97cf\" (UniqueName: \"kubernetes.io/projected/da59725d-9914-40d1-b70b-57df96de1db2-kube-api-access-b97cf\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.916287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.916331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.920120 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.920514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da59725d-9914-40d1-b70b-57df96de1db2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.932436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97cf\" (UniqueName: \"kubernetes.io/projected/da59725d-9914-40d1-b70b-57df96de1db2-kube-api-access-b97cf\") pod \"nova-cell0-conductor-0\" (UID: \"da59725d-9914-40d1-b70b-57df96de1db2\") " pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:28 crc kubenswrapper[4743]: I1125 16:17:28.980889 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:29 crc kubenswrapper[4743]: I1125 16:17:29.405684 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 16:17:29 crc kubenswrapper[4743]: I1125 16:17:29.570676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"da59725d-9914-40d1-b70b-57df96de1db2","Type":"ContainerStarted","Data":"4f36f9e8db62263ff6df052ead8aa77e75a55532b2637c39ab3711caf5b5192e"} Nov 25 16:17:29 crc kubenswrapper[4743]: I1125 16:17:29.570728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"da59725d-9914-40d1-b70b-57df96de1db2","Type":"ContainerStarted","Data":"c145c3eb39e825a143dfa473cf2a19d313837bfa1bf2bfb5152b40be35cb0b3d"} Nov 25 16:17:29 crc kubenswrapper[4743]: I1125 16:17:29.570820 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:31 crc kubenswrapper[4743]: I1125 16:17:31.879362 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 16:17:31 crc kubenswrapper[4743]: I1125 16:17:31.879859 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 16:17:31 crc kubenswrapper[4743]: I1125 16:17:31.911579 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 16:17:31 crc kubenswrapper[4743]: I1125 16:17:31.927392 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 16:17:31 crc kubenswrapper[4743]: I1125 16:17:31.935088 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.935070848 podStartE2EDuration="3.935070848s" 
podCreationTimestamp="2025-11-25 16:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:29.587439943 +0000 UTC m=+1128.709279502" watchObservedRunningTime="2025-11-25 16:17:31.935070848 +0000 UTC m=+1131.056910397" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.596712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.596763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.933788 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.933840 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.961580 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:32 crc kubenswrapper[4743]: I1125 16:17:32.987483 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:33 crc kubenswrapper[4743]: I1125 16:17:33.603954 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:33 crc kubenswrapper[4743]: I1125 16:17:33.604332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:34 crc kubenswrapper[4743]: I1125 16:17:34.377111 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 16:17:34 crc kubenswrapper[4743]: I1125 
16:17:34.381172 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 16:17:35 crc kubenswrapper[4743]: I1125 16:17:35.480868 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:35 crc kubenswrapper[4743]: I1125 16:17:35.559226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 16:17:35 crc kubenswrapper[4743]: I1125 16:17:35.832822 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.020177 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.572123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pccxd"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.574036 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.576303 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.577580 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.595881 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pccxd"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.621516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.621752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmb2\" (UniqueName: \"kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.621862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.622132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.723319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmb2\" (UniqueName: \"kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.723375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.723444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.723512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.733941 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.735252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.751239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.752198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmb2\" (UniqueName: \"kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2\") pod \"nova-cell0-cell-mapping-pccxd\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.757060 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.758860 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.761005 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.775658 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.776970 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.779726 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.796365 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.824216 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825177 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wm7x\" (UniqueName: \"kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.825469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjlq\" (UniqueName: \"kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.903025 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.918185 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.919969 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.922829 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.929253 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wm7x\" (UniqueName: \"kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931852 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931947 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.931983 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ltg\" (UniqueName: \"kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.932006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.932040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data\") pod \"nova-api-0\" (UID: 
\"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.932070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjlq\" (UniqueName: \"kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.932091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.934076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.953287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.958136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.958193 4743 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.959303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.963891 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.983569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wm7x\" (UniqueName: \"kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.984129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") " pod="openstack/nova-api-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.984508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:39 crc kubenswrapper[4743]: I1125 16:17:39.993400 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.002991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjlq\" (UniqueName: \"kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq\") pod \"nova-cell1-novncproxy-0\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " pod="openstack/nova-cell1-novncproxy-0" 
Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.025299 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.027522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038140 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22ltg\" (UniqueName: \"kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnd67\" (UniqueName: \"kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.038369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.040070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.042871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.052827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.053364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.066482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ltg\" (UniqueName: \"kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg\") pod \"nova-metadata-0\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.072134 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.139841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.140220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.140435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.140675 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.140726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnd67\" (UniqueName: \"kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.140844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.141017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cz95\" (UniqueName: \"kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.141043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.141077 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.146274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.146916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.159997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnd67\" (UniqueName: \"kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67\") pod \"nova-scheduler-0\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.167012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.191897 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243725 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243801 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cz95\" (UniqueName: \"kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " 
pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.243878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.244696 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.245196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.245697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.246174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 
16:17:40.246967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.268990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cz95\" (UniqueName: \"kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95\") pod \"dnsmasq-dns-757b4f8459-2d76n\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.383224 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.395730 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.483861 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pccxd"] Nov 25 16:17:40 crc kubenswrapper[4743]: W1125 16:17:40.492873 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451a339d_6ea1_4ce0_a550_fcaad7d83f28.slice/crio-4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb WatchSource:0}: Error finding container 4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb: Status 404 returned error can't find the container with id 4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.670786 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xssnc"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.672522 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.681940 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.692974 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.693090 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.720263 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xssnc"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.783484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.783619 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.783718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 
16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.783783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2twf6\" (UniqueName: \"kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.789330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pccxd" event={"ID":"451a339d-6ea1-4ce0-a550-fcaad7d83f28","Type":"ContainerStarted","Data":"4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb"} Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.861878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.886015 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.886096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.886154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: 
\"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.886193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2twf6\" (UniqueName: \"kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.899987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.905759 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.907675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.914546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:40 crc kubenswrapper[4743]: I1125 16:17:40.915216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2twf6\" 
(UniqueName: \"kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6\") pod \"nova-cell1-conductor-db-sync-xssnc\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:41 crc kubenswrapper[4743]: W1125 16:17:41.109477 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ac2c45f_7d0b_4af7_80a3_e4e0913f330d.slice/crio-030b7a595ef60fb2184dff57406fe15d960d830437946d4c91a6f8ea8fe1502b WatchSource:0}: Error finding container 030b7a595ef60fb2184dff57406fe15d960d830437946d4c91a6f8ea8fe1502b: Status 404 returned error can't find the container with id 030b7a595ef60fb2184dff57406fe15d960d830437946d4c91a6f8ea8fe1502b Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.114487 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.151509 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.214898 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:17:41 crc kubenswrapper[4743]: W1125 16:17:41.223220 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1ac2b54_2938_4ec4_b0d3_3ca63d834ff6.slice/crio-ae5bb29138513ed5f0aa21a70cb817966596070ebf2b1b172f0314ac1ba50a30 WatchSource:0}: Error finding container ae5bb29138513ed5f0aa21a70cb817966596070ebf2b1b172f0314ac1ba50a30: Status 404 returned error can't find the container with id ae5bb29138513ed5f0aa21a70cb817966596070ebf2b1b172f0314ac1ba50a30 Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.638817 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xssnc"] Nov 25 16:17:41 crc kubenswrapper[4743]: W1125 16:17:41.647906 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod796f4930_dad8_4c02_9ffa_00df9a6689ff.slice/crio-8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68 WatchSource:0}: Error finding container 8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68: Status 404 returned error can't find the container with id 8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68 Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.801719 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerStarted","Data":"3038fdd4d91301fa79578c6d00782348371e591df33c9757488a890ac3215783"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.801771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" 
event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerStarted","Data":"030b7a595ef60fb2184dff57406fe15d960d830437946d4c91a6f8ea8fe1502b"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.804058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6","Type":"ContainerStarted","Data":"ae5bb29138513ed5f0aa21a70cb817966596070ebf2b1b172f0314ac1ba50a30"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.818380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xssnc" event={"ID":"796f4930-dad8-4c02-9ffa-00df9a6689ff","Type":"ContainerStarted","Data":"8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.818445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"815b7932-2bb0-47c6-a8e2-c182484259c4","Type":"ContainerStarted","Data":"63454015e6c14c3b25bd9a19a64790aa43f7f7bbd304ff4ceb9cdb2ace22e70e"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.818465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerStarted","Data":"1422f9b2b98dfc8015a2073160a6d7df6856909ef06aa1b9c35e2a19c7450940"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.818489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerStarted","Data":"1f7bc345ce0523ec325bbdebc53cde0e39ef9978d5b68c30a81eabae309b4d42"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.821906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pccxd" 
event={"ID":"451a339d-6ea1-4ce0-a550-fcaad7d83f28","Type":"ContainerStarted","Data":"87336b290dd47e553fbd823716fa8fbfcb4d48cad500d15d9e16be29952f724e"} Nov 25 16:17:41 crc kubenswrapper[4743]: I1125 16:17:41.945324 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pccxd" podStartSLOduration=2.94530514 podStartE2EDuration="2.94530514s" podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:41.934066998 +0000 UTC m=+1141.055906567" watchObservedRunningTime="2025-11-25 16:17:41.94530514 +0000 UTC m=+1141.067144689" Nov 25 16:17:42 crc kubenswrapper[4743]: I1125 16:17:42.843138 4743 generic.go:334] "Generic (PLEG): container finished" podID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerID="3038fdd4d91301fa79578c6d00782348371e591df33c9757488a890ac3215783" exitCode=0 Nov 25 16:17:42 crc kubenswrapper[4743]: I1125 16:17:42.843229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerDied","Data":"3038fdd4d91301fa79578c6d00782348371e591df33c9757488a890ac3215783"} Nov 25 16:17:42 crc kubenswrapper[4743]: I1125 16:17:42.845807 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xssnc" event={"ID":"796f4930-dad8-4c02-9ffa-00df9a6689ff","Type":"ContainerStarted","Data":"e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0"} Nov 25 16:17:43 crc kubenswrapper[4743]: I1125 16:17:43.459676 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:43 crc kubenswrapper[4743]: I1125 16:17:43.468507 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:17:43 crc kubenswrapper[4743]: I1125 16:17:43.871772 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xssnc" podStartSLOduration=3.871755424 podStartE2EDuration="3.871755424s" podCreationTimestamp="2025-11-25 16:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:43.871108083 +0000 UTC m=+1142.992947622" watchObservedRunningTime="2025-11-25 16:17:43.871755424 +0000 UTC m=+1142.993594963" Nov 25 16:17:44 crc kubenswrapper[4743]: I1125 16:17:44.867059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerStarted","Data":"7f29edb941d93fb623ce9677aecd6ad754dcf097395097dab9706bfc3e4b1c0c"} Nov 25 16:17:44 crc kubenswrapper[4743]: I1125 16:17:44.867480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:44 crc kubenswrapper[4743]: I1125 16:17:44.892271 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" podStartSLOduration=5.892250488 podStartE2EDuration="5.892250488s" podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:44.888615124 +0000 UTC m=+1144.010454693" watchObservedRunningTime="2025-11-25 16:17:44.892250488 +0000 UTC m=+1144.014090037" Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.877670 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerStarted","Data":"d9e46b692cb2bd8c6e5be1cdda753ba5eb2d84f88edc02f0d58e44fb852d26d3"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.878090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerStarted","Data":"0a580f72fc8cc629f46a7b419f45c9b4fddca319d2891593b286cab4c8c8248f"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.879423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerStarted","Data":"7dd8c8a412d18cdf20ff28ebfe5b7e4f6ca76cb737624361b6f7bed227d68ccf"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.879454 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-log" containerID="cri-o://ea7b9050d28f76913ed959c917818132aef1c9c135892f603456594f85a036b1" gracePeriod=30 Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.879468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerStarted","Data":"ea7b9050d28f76913ed959c917818132aef1c9c135892f603456594f85a036b1"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.879517 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-metadata" containerID="cri-o://7dd8c8a412d18cdf20ff28ebfe5b7e4f6ca76cb737624361b6f7bed227d68ccf" gracePeriod=30 Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.881788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6","Type":"ContainerStarted","Data":"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.887242 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="815b7932-2bb0-47c6-a8e2-c182484259c4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://cbed122c43dc1a205f1b74e157100f22782c4010ea152a4d0a3ce8ef12e9f336" gracePeriod=30 Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.887370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"815b7932-2bb0-47c6-a8e2-c182484259c4","Type":"ContainerStarted","Data":"cbed122c43dc1a205f1b74e157100f22782c4010ea152a4d0a3ce8ef12e9f336"} Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.906188 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.960016433 podStartE2EDuration="6.906166586s" podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="2025-11-25 16:17:40.822388081 +0000 UTC m=+1139.944227630" lastFinishedPulling="2025-11-25 16:17:43.768538234 +0000 UTC m=+1142.890377783" observedRunningTime="2025-11-25 16:17:45.901238151 +0000 UTC m=+1145.023077710" watchObservedRunningTime="2025-11-25 16:17:45.906166586 +0000 UTC m=+1145.028006145" Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.924786 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.293656724 podStartE2EDuration="6.924762019s" podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="2025-11-25 16:17:40.914159041 +0000 UTC m=+1140.035998590" lastFinishedPulling="2025-11-25 16:17:44.545264346 +0000 UTC m=+1143.667103885" observedRunningTime="2025-11-25 16:17:45.919422972 +0000 UTC m=+1145.041262531" watchObservedRunningTime="2025-11-25 16:17:45.924762019 +0000 UTC m=+1145.046601568" Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.939321 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.626339538 podStartE2EDuration="6.939299206s" 
podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="2025-11-25 16:17:41.233833836 +0000 UTC m=+1140.355673385" lastFinishedPulling="2025-11-25 16:17:44.546793504 +0000 UTC m=+1143.668633053" observedRunningTime="2025-11-25 16:17:45.93529133 +0000 UTC m=+1145.057130889" watchObservedRunningTime="2025-11-25 16:17:45.939299206 +0000 UTC m=+1145.061138765" Nov 25 16:17:45 crc kubenswrapper[4743]: I1125 16:17:45.965868 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.516768119 podStartE2EDuration="6.965828179s" podCreationTimestamp="2025-11-25 16:17:39 +0000 UTC" firstStartedPulling="2025-11-25 16:17:40.735818523 +0000 UTC m=+1139.857658072" lastFinishedPulling="2025-11-25 16:17:44.184878583 +0000 UTC m=+1143.306718132" observedRunningTime="2025-11-25 16:17:45.959292953 +0000 UTC m=+1145.081132512" watchObservedRunningTime="2025-11-25 16:17:45.965828179 +0000 UTC m=+1145.087667728" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.813073 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.901339 4743 generic.go:334] "Generic (PLEG): container finished" podID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerID="08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a" exitCode=137 Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.901402 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerDied","Data":"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a"} Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.901404 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.901428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c08dfadc-c16a-4c82-be2d-318ab9aae386","Type":"ContainerDied","Data":"b8f7aeae6049264369f098a16d7ba5c4987412c4b5d100e40cd6dada30cee52c"} Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.901445 4743 scope.go:117] "RemoveContainer" containerID="08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.908380 4743 generic.go:334] "Generic (PLEG): container finished" podID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerID="7dd8c8a412d18cdf20ff28ebfe5b7e4f6ca76cb737624361b6f7bed227d68ccf" exitCode=0 Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.909408 4743 generic.go:334] "Generic (PLEG): container finished" podID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerID="ea7b9050d28f76913ed959c917818132aef1c9c135892f603456594f85a036b1" exitCode=143 Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.908561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerDied","Data":"7dd8c8a412d18cdf20ff28ebfe5b7e4f6ca76cb737624361b6f7bed227d68ccf"} Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.909755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerDied","Data":"ea7b9050d28f76913ed959c917818132aef1c9c135892f603456594f85a036b1"} Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.909771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8","Type":"ContainerDied","Data":"1f7bc345ce0523ec325bbdebc53cde0e39ef9978d5b68c30a81eabae309b4d42"} Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 
16:17:46.909786 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f7bc345ce0523ec325bbdebc53cde0e39ef9978d5b68c30a81eabae309b4d42" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.980133 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:46 crc kubenswrapper[4743]: I1125 16:17:46.981309 4743 scope.go:117] "RemoveContainer" containerID="5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2gxv\" (UniqueName: \"kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs\") pod \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015535 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015552 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015626 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle\") pod \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data\") pod \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\" (UID: \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015679 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22ltg\" (UniqueName: \"kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg\") pod \"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\" (UID: 
\"c32a467a-c44a-40ef-a9a3-16d6e61a4cd8\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.015755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data\") pod \"c08dfadc-c16a-4c82-be2d-318ab9aae386\" (UID: \"c08dfadc-c16a-4c82-be2d-318ab9aae386\") " Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.018169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.020986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.025486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs" (OuterVolumeSpecName: "logs") pod "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" (UID: "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.027769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg" (OuterVolumeSpecName: "kube-api-access-22ltg") pod "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" (UID: "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8"). InnerVolumeSpecName "kube-api-access-22ltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.033853 4743 scope.go:117] "RemoveContainer" containerID="9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.041143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts" (OuterVolumeSpecName: "scripts") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.041223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv" (OuterVolumeSpecName: "kube-api-access-d2gxv") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "kube-api-access-d2gxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.058762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" (UID: "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.065826 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.080664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data" (OuterVolumeSpecName: "config-data") pod "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" (UID: "c32a467a-c44a-40ef-a9a3-16d6e61a4cd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.082732 4743 scope.go:117] "RemoveContainer" containerID="cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.105508 4743 scope.go:117] "RemoveContainer" containerID="08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.105988 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a\": container with ID starting with 08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a not found: ID does not exist" containerID="08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106036 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a"} err="failed to get 
container status \"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a\": rpc error: code = NotFound desc = could not find container \"08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a\": container with ID starting with 08d01dab6fdc4f4476fa863d86caf7a2f0f0b1b9f755b31b7550b7f1ae57ac2a not found: ID does not exist" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106062 4743 scope.go:117] "RemoveContainer" containerID="5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.106466 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7\": container with ID starting with 5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7 not found: ID does not exist" containerID="5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106487 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7"} err="failed to get container status \"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7\": rpc error: code = NotFound desc = could not find container \"5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7\": container with ID starting with 5bdec32ad26d7b6526897b537449d6c1ff4087b3ef59178ff5968dfe2e7c76e7 not found: ID does not exist" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106500 4743 scope.go:117] "RemoveContainer" containerID="9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.106767 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e\": container with ID starting with 9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e not found: ID does not exist" containerID="9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106811 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e"} err="failed to get container status \"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e\": rpc error: code = NotFound desc = could not find container \"9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e\": container with ID starting with 9998c2d80b6bf7df93e64057998fb73f7398879fbbc782dc3bcba76223c9278e not found: ID does not exist" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.106840 4743 scope.go:117] "RemoveContainer" containerID="cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.107146 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a\": container with ID starting with cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a not found: ID does not exist" containerID="cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.107170 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a"} err="failed to get container status \"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a\": rpc error: code = NotFound desc = could not find container \"cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a\": container with ID 
starting with cef64f60cca8550dcdc59cb1865d390f2047e35342c7cab795b3c9eaa6ff1f7a not found: ID does not exist" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117734 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117761 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117774 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117788 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117803 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117815 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c08dfadc-c16a-4c82-be2d-318ab9aae386-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117826 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22ltg\" (UniqueName: \"kubernetes.io/projected/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8-kube-api-access-22ltg\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 
16:17:47.117838 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2gxv\" (UniqueName: \"kubernetes.io/projected/c08dfadc-c16a-4c82-be2d-318ab9aae386-kube-api-access-d2gxv\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.117849 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.119928 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.142806 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data" (OuterVolumeSpecName: "config-data") pod "c08dfadc-c16a-4c82-be2d-318ab9aae386" (UID: "c08dfadc-c16a-4c82-be2d-318ab9aae386"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.219111 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.219139 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfadc-c16a-4c82-be2d-318ab9aae386-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.240001 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.254424 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.264511 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.264936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-notification-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.264954 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-notification-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.264964 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-metadata" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.264970 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-metadata" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.264985 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="proxy-httpd" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.264992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="proxy-httpd" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.265001 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-central-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265006 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-central-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.265035 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="sg-core" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265041 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="sg-core" Nov 25 16:17:47 crc kubenswrapper[4743]: E1125 16:17:47.265058 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-log" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265064 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-log" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265224 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-metadata" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265240 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-notification-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265249 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="sg-core" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265259 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" containerName="nova-metadata-log" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265279 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="proxy-httpd" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.265289 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" containerName="ceilometer-central-agent" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.266954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.269069 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.269278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.275485 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.320901 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.320941 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7759l\" (UniqueName: 
\"kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.321190 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.321256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.321294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.321309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.321453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts\") pod \"ceilometer-0\" (UID: 
\"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7759l\" (UniqueName: \"kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422149 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.422537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.423021 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.423076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.426295 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.426774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 
16:17:47.428356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.428668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.440283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7759l\" (UniqueName: \"kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l\") pod \"ceilometer-0\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") " pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.589630 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.793830 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08dfadc-c16a-4c82-be2d-318ab9aae386" path="/var/lib/kubelet/pods/c08dfadc-c16a-4c82-be2d-318ab9aae386/volumes" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.918619 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.948621 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.958756 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.971324 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.973816 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.978196 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 16:17:47 crc kubenswrapper[4743]: I1125 16:17:47.979298 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.000013 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.057011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:17:48 crc kubenswrapper[4743]: E1125 16:17:48.077298 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc32a467a_c44a_40ef_a9a3_16d6e61a4cd8.slice/crio-1f7bc345ce0523ec325bbdebc53cde0e39ef9978d5b68c30a81eabae309b4d42\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc32a467a_c44a_40ef_a9a3_16d6e61a4cd8.slice\": RecentStats: unable to find data in memory cache]" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.136415 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.136987 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.137041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.137077 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.137106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmfr\" (UniqueName: \"kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.238528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.238662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.238689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.238710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmfr\" (UniqueName: \"kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.238796 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.239269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 
16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.244584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.244816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.244924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.254098 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmfr\" (UniqueName: \"kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr\") pod \"nova-metadata-0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") " pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.303256 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.729147 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:48 crc kubenswrapper[4743]: W1125 16:17:48.736138 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249a955d_fd8a_4a59_99e6_8f5b2c809af0.slice/crio-317b00bc06735c5f6c93d35fffafe3a687ebe385043cab2e063e230863ea8174 WatchSource:0}: Error finding container 317b00bc06735c5f6c93d35fffafe3a687ebe385043cab2e063e230863ea8174: Status 404 returned error can't find the container with id 317b00bc06735c5f6c93d35fffafe3a687ebe385043cab2e063e230863ea8174 Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.928014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerStarted","Data":"317b00bc06735c5f6c93d35fffafe3a687ebe385043cab2e063e230863ea8174"} Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.929983 4743 generic.go:334] "Generic (PLEG): container finished" podID="451a339d-6ea1-4ce0-a550-fcaad7d83f28" containerID="87336b290dd47e553fbd823716fa8fbfcb4d48cad500d15d9e16be29952f724e" exitCode=0 Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.930064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pccxd" event={"ID":"451a339d-6ea1-4ce0-a550-fcaad7d83f28","Type":"ContainerDied","Data":"87336b290dd47e553fbd823716fa8fbfcb4d48cad500d15d9e16be29952f724e"} Nov 25 16:17:48 crc kubenswrapper[4743]: I1125 16:17:48.931762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerStarted","Data":"21243c1d1bff8a2a071c45731e190e8578207e3873a8cc8448839268fc7f6a79"} Nov 25 16:17:49 crc kubenswrapper[4743]: I1125 16:17:49.785112 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32a467a-c44a-40ef-a9a3-16d6e61a4cd8" path="/var/lib/kubelet/pods/c32a467a-c44a-40ef-a9a3-16d6e61a4cd8/volumes" Nov 25 16:17:49 crc kubenswrapper[4743]: I1125 16:17:49.943405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerStarted","Data":"4394a5b6d239405f07da4e592c51984b5a9db79804bca4b927be7b1cf4ff194d"} Nov 25 16:17:49 crc kubenswrapper[4743]: I1125 16:17:49.948056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerStarted","Data":"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"} Nov 25 16:17:49 crc kubenswrapper[4743]: I1125 16:17:49.948188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerStarted","Data":"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"} Nov 25 16:17:49 crc kubenswrapper[4743]: I1125 16:17:49.975372 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.975351612 podStartE2EDuration="2.975351612s" podCreationTimestamp="2025-11-25 16:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:49.969627533 +0000 UTC m=+1149.091467082" watchObservedRunningTime="2025-11-25 16:17:49.975351612 +0000 UTC m=+1149.097191181" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.077328 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 
25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.077379 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.077424 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.078317 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.078386 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a" gracePeriod=600 Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.167422 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.167828 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.192782 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.299924 4743 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts\") pod \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data\") pod \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383483 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tmb2\" (UniqueName: \"kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2\") pod \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle\") pod \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\" (UID: \"451a339d-6ea1-4ce0-a550-fcaad7d83f28\") " Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383683 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.383729 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.388448 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts" (OuterVolumeSpecName: "scripts") pod "451a339d-6ea1-4ce0-a550-fcaad7d83f28" (UID: "451a339d-6ea1-4ce0-a550-fcaad7d83f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.389317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2" (OuterVolumeSpecName: "kube-api-access-7tmb2") pod "451a339d-6ea1-4ce0-a550-fcaad7d83f28" (UID: "451a339d-6ea1-4ce0-a550-fcaad7d83f28"). InnerVolumeSpecName "kube-api-access-7tmb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.397726 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.409969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "451a339d-6ea1-4ce0-a550-fcaad7d83f28" (UID: "451a339d-6ea1-4ce0-a550-fcaad7d83f28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.413452 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.445872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data" (OuterVolumeSpecName: "config-data") pod "451a339d-6ea1-4ce0-a550-fcaad7d83f28" (UID: "451a339d-6ea1-4ce0-a550-fcaad7d83f28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.477225 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.477469 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="dnsmasq-dns" containerID="cri-o://f78541f79cdc42aa719c3782039125f449e272a16e70e225e6a7b631a21b6619" gracePeriod=10 Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.487264 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.487302 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tmb2\" (UniqueName: \"kubernetes.io/projected/451a339d-6ea1-4ce0-a550-fcaad7d83f28-kube-api-access-7tmb2\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.487319 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.487332 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/451a339d-6ea1-4ce0-a550-fcaad7d83f28-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.959869 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerStarted","Data":"6ec3aff340c586ec6b7971e16093e4df4cba00b97a603250194bbdb47ca6ace4"} Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 
16:17:50.966155 4743 generic.go:334] "Generic (PLEG): container finished" podID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerID="f78541f79cdc42aa719c3782039125f449e272a16e70e225e6a7b631a21b6619" exitCode=0 Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.966240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" event={"ID":"f402dd15-78ae-4695-a91b-0cf339b16c76","Type":"ContainerDied","Data":"f78541f79cdc42aa719c3782039125f449e272a16e70e225e6a7b631a21b6619"} Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.968146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pccxd" event={"ID":"451a339d-6ea1-4ce0-a550-fcaad7d83f28","Type":"ContainerDied","Data":"4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb"} Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.968200 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4522f204e37a9737e97b7d3db2c37067896984d7886f779f8f0ad2053dd360cb" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.968292 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pccxd" Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.971896 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a" exitCode=0 Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.973008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a"} Nov 25 16:17:50 crc kubenswrapper[4743]: I1125 16:17:50.973044 4743 scope.go:117] "RemoveContainer" containerID="3891bef80e07425c4fd47953c65e853bb10f31ef01a35da0d32440f2ed3b5e2c" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.005302 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.039719 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.093759 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.093977 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-log" containerID="cri-o://0a580f72fc8cc629f46a7b419f45c9b4fddca319d2891593b286cab4c8c8248f" gracePeriod=30 Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.094099 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-api" containerID="cri-o://d9e46b692cb2bd8c6e5be1cdda753ba5eb2d84f88edc02f0d58e44fb852d26d3" gracePeriod=30 Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.104846 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.104858 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.148853 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.463959 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.828685 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.915657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns968\" (UniqueName: \"kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.916142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.916172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.916200 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.916264 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.916379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb\") pod \"f402dd15-78ae-4695-a91b-0cf339b16c76\" (UID: \"f402dd15-78ae-4695-a91b-0cf339b16c76\") " Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.929721 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968" (OuterVolumeSpecName: "kube-api-access-ns968") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "kube-api-access-ns968". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:51 crc kubenswrapper[4743]: I1125 16:17:51.989469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28"} Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.010264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" event={"ID":"f402dd15-78ae-4695-a91b-0cf339b16c76","Type":"ContainerDied","Data":"d5a561bf4fef0d4f43d96dad6ca09b86882798dd4df3a86fd0a5b514405d16e5"} Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.010325 4743 scope.go:117] "RemoveContainer" containerID="f78541f79cdc42aa719c3782039125f449e272a16e70e225e6a7b631a21b6619" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.010456 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-p6wg7" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.025078 4743 generic.go:334] "Generic (PLEG): container finished" podID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerID="0a580f72fc8cc629f46a7b419f45c9b4fddca319d2891593b286cab4c8c8248f" exitCode=143 Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.025243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerDied","Data":"0a580f72fc8cc629f46a7b419f45c9b4fddca319d2891593b286cab4c8c8248f"} Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.027354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.029674 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.030061 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns968\" (UniqueName: \"kubernetes.io/projected/f402dd15-78ae-4695-a91b-0cf339b16c76-kube-api-access-ns968\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.043184 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config" (OuterVolumeSpecName: "config") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.051140 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.055325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.085656 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f402dd15-78ae-4695-a91b-0cf339b16c76" (UID: "f402dd15-78ae-4695-a91b-0cf339b16c76"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.131822 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.131862 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.131874 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.131883 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f402dd15-78ae-4695-a91b-0cf339b16c76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.178243 4743 scope.go:117] "RemoveContainer" containerID="45f0c5c98959884ee5c801b099ed5bff8e3a3cf8055561f103641ff96be46cb5" Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.350661 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:17:52 crc kubenswrapper[4743]: I1125 16:17:52.356309 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-p6wg7"] Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.034927 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerName="nova-scheduler-scheduler" containerID="cri-o://02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea" gracePeriod=30 Nov 25 16:17:53 crc 
kubenswrapper[4743]: I1125 16:17:53.035105 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-log" containerID="cri-o://d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268" gracePeriod=30
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.035266 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-metadata" containerID="cri-o://dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8" gracePeriod=30
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.303866 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.304286 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.787272 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" path="/var/lib/kubelet/pods/f402dd15-78ae-4695-a91b-0cf339b16c76/volumes"
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.856453 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tmfr\" (UniqueName: \"kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr\") pod \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") "
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle\") pod \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") "
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs\") pod \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") "
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969622 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data\") pod \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") "
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969637 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs\") pod \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\" (UID: \"249a955d-fd8a-4a59-99e6-8f5b2c809af0\") "
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.969966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs" (OuterVolumeSpecName: "logs") pod "249a955d-fd8a-4a59-99e6-8f5b2c809af0" (UID: "249a955d-fd8a-4a59-99e6-8f5b2c809af0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.970183 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249a955d-fd8a-4a59-99e6-8f5b2c809af0-logs\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:53 crc kubenswrapper[4743]: I1125 16:17:53.991775 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr" (OuterVolumeSpecName: "kube-api-access-4tmfr") pod "249a955d-fd8a-4a59-99e6-8f5b2c809af0" (UID: "249a955d-fd8a-4a59-99e6-8f5b2c809af0"). InnerVolumeSpecName "kube-api-access-4tmfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.012064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data" (OuterVolumeSpecName: "config-data") pod "249a955d-fd8a-4a59-99e6-8f5b2c809af0" (UID: "249a955d-fd8a-4a59-99e6-8f5b2c809af0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.020791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249a955d-fd8a-4a59-99e6-8f5b2c809af0" (UID: "249a955d-fd8a-4a59-99e6-8f5b2c809af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.038380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "249a955d-fd8a-4a59-99e6-8f5b2c809af0" (UID: "249a955d-fd8a-4a59-99e6-8f5b2c809af0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.058888 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerStarted","Data":"fbd38eb8cbd146c06de2de7a954ed041d2de374f3f45e4783bbec0a43f15cc0f"}
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.060889 4743 generic.go:334] "Generic (PLEG): container finished" podID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerID="dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8" exitCode=0
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.060922 4743 generic.go:334] "Generic (PLEG): container finished" podID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerID="d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268" exitCode=143
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.060953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerDied","Data":"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"}
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.060980 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerDied","Data":"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"}
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.060992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"249a955d-fd8a-4a59-99e6-8f5b2c809af0","Type":"ContainerDied","Data":"317b00bc06735c5f6c93d35fffafe3a687ebe385043cab2e063e230863ea8174"}
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.061010 4743 scope.go:117] "RemoveContainer" containerID="dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.061323 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.073231 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.073584 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.073612 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a955d-fd8a-4a59-99e6-8f5b2c809af0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.073624 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tmfr\" (UniqueName: \"kubernetes.io/projected/249a955d-fd8a-4a59-99e6-8f5b2c809af0-kube-api-access-4tmfr\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.172234 4743 scope.go:117] "RemoveContainer" containerID="d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.176272 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.189623 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.198781 4743 scope.go:117] "RemoveContainer" containerID="dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.206681 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8\": container with ID starting with dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8 not found: ID does not exist" containerID="dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.206758 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"} err="failed to get container status \"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8\": rpc error: code = NotFound desc = could not find container \"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8\": container with ID starting with dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8 not found: ID does not exist"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.206794 4743 scope.go:117] "RemoveContainer" containerID="d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.207464 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268\": container with ID starting with d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268 not found: ID does not exist" containerID="d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.207524 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"} err="failed to get container status \"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268\": rpc error: code = NotFound desc = could not find container \"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268\": container with ID starting with d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268 not found: ID does not exist"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.207560 4743 scope.go:117] "RemoveContainer" containerID="dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.207848 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8"} err="failed to get container status \"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8\": rpc error: code = NotFound desc = could not find container \"dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8\": container with ID starting with dda18fe1f194b18814223d6e9c0e7b70ba982caf031ab220184d47170ccf4cd8 not found: ID does not exist"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.207872 4743 scope.go:117] "RemoveContainer" containerID="d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.208036 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268"} err="failed to get container status \"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268\": rpc error: code = NotFound desc = could not find container \"d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268\": container with ID starting with d5bd1dac12ca98c79cd957c884e4c7e801672d3f67109c007f461587273b4268 not found: ID does not exist"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.208431 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.209137 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451a339d-6ea1-4ce0-a550-fcaad7d83f28" containerName="nova-manage"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.209199 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="451a339d-6ea1-4ce0-a550-fcaad7d83f28" containerName="nova-manage"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.209226 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="init"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.209234 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="init"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.209250 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-metadata"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.209285 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-metadata"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.209304 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="dnsmasq-dns"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.209312 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="dnsmasq-dns"
Nov 25 16:17:54 crc kubenswrapper[4743]: E1125 16:17:54.209374 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-log"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.209385 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-log"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.211836 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="451a339d-6ea1-4ce0-a550-fcaad7d83f28" containerName="nova-manage"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.211907 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-log"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.211935 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f402dd15-78ae-4695-a91b-0cf339b16c76" containerName="dnsmasq-dns"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.211984 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" containerName="nova-metadata-metadata"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.214225 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.218364 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.219140 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.219991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.276862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.276921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.276945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.276976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.276998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvjgd\" (UniqueName: \"kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.379319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.379387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.379408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.379439 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.379460 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvjgd\" (UniqueName: \"kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.380326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.383739 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.389254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.389377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.398750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvjgd\" (UniqueName: \"kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd\") pod \"nova-metadata-0\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " pod="openstack/nova-metadata-0"
Nov 25 16:17:54 crc kubenswrapper[4743]: I1125 16:17:54.535069 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 16:17:55 crc kubenswrapper[4743]: I1125 16:17:55.049403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 16:17:55 crc kubenswrapper[4743]: I1125 16:17:55.071182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerStarted","Data":"deaaf9e8f2a2c8a58491b1dfa091cfb9d621b393209c7639a80c6c719c844a87"}
Nov 25 16:17:55 crc kubenswrapper[4743]: E1125 16:17:55.385494 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 16:17:55 crc kubenswrapper[4743]: E1125 16:17:55.387526 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 16:17:55 crc kubenswrapper[4743]: E1125 16:17:55.388797 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 16:17:55 crc kubenswrapper[4743]: E1125 16:17:55.388860 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerName="nova-scheduler-scheduler"
Nov 25 16:17:55 crc kubenswrapper[4743]: I1125 16:17:55.785512 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249a955d-fd8a-4a59-99e6-8f5b2c809af0" path="/var/lib/kubelet/pods/249a955d-fd8a-4a59-99e6-8f5b2c809af0/volumes"
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.086788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerStarted","Data":"eaa4dc3e96f54f6a0e7f3832e5c71a8fde266fdf0189bdd829aa2cc54c5fabc4"}
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.087201 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.088954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerStarted","Data":"344b618bfb5b5cc568e87479f8d51655cba2f1f945a544d7f62c20acc79527ab"}
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.088996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerStarted","Data":"33ead0ca9162468c43f78ce7d90b8500bd26be949d79154b77a9e94103ae7de3"}
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.108961 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.109954637 podStartE2EDuration="9.108942863s" podCreationTimestamp="2025-11-25 16:17:47 +0000 UTC" firstStartedPulling="2025-11-25 16:17:48.062680481 +0000 UTC m=+1147.184520030" lastFinishedPulling="2025-11-25 16:17:55.061668707 +0000 UTC m=+1154.183508256" observedRunningTime="2025-11-25 16:17:56.106479876 +0000 UTC m=+1155.228319445" watchObservedRunningTime="2025-11-25 16:17:56.108942863 +0000 UTC m=+1155.230782422"
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.129795 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.129772596 podStartE2EDuration="2.129772596s" podCreationTimestamp="2025-11-25 16:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:56.122917811 +0000 UTC m=+1155.244757380" watchObservedRunningTime="2025-11-25 16:17:56.129772596 +0000 UTC m=+1155.251612145"
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.473996 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.520122 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnd67\" (UniqueName: \"kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67\") pod \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") "
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.521321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle\") pod \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") "
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.521505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data\") pod \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\" (UID: \"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6\") "
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.532837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67" (OuterVolumeSpecName: "kube-api-access-vnd67") pod "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" (UID: "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6"). InnerVolumeSpecName "kube-api-access-vnd67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.550946 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" (UID: "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.559462 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data" (OuterVolumeSpecName: "config-data") pod "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" (UID: "b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.623620 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnd67\" (UniqueName: \"kubernetes.io/projected/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-kube-api-access-vnd67\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.623654 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:56 crc kubenswrapper[4743]: I1125 16:17:56.623664 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.105142 4743 generic.go:334] "Generic (PLEG): container finished" podID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerID="d9e46b692cb2bd8c6e5be1cdda753ba5eb2d84f88edc02f0d58e44fb852d26d3" exitCode=0
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.105226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerDied","Data":"d9e46b692cb2bd8c6e5be1cdda753ba5eb2d84f88edc02f0d58e44fb852d26d3"}
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.107728 4743 generic.go:334] "Generic (PLEG): container finished" podID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea" exitCode=0
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.107818 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.107866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6","Type":"ContainerDied","Data":"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"}
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.107914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6","Type":"ContainerDied","Data":"ae5bb29138513ed5f0aa21a70cb817966596070ebf2b1b172f0314ac1ba50a30"}
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.107937 4743 scope.go:117] "RemoveContainer" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.153340 4743 scope.go:117] "RemoveContainer" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"
Nov 25 16:17:57 crc kubenswrapper[4743]: E1125 16:17:57.154192 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea\": container with ID starting with 02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea not found: ID does not exist" containerID="02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.154225 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea"} err="failed to get container status \"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea\": rpc error: code = NotFound desc = could not find container \"02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea\": container with ID starting with 02dc5b4bc70a1e8fa73ab08802ea7a081bfc623b4f97aa515df400fdf5a1bcea not found: ID does not exist"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.155744 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.185467 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.206700 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 16:17:57 crc kubenswrapper[4743]: E1125 16:17:57.207077 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerName="nova-scheduler-scheduler"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.207091 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerName="nova-scheduler-scheduler"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.207251 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" containerName="nova-scheduler-scheduler"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.207912 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.210275 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.243680 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.244864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rk5r\" (UniqueName: \"kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.244981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.245039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.322855 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.347479 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data\") pod \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") "
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.347657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs\") pod \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") "
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.347749 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wm7x\" (UniqueName: \"kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x\") pod \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") "
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.347866 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle\") pod \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\" (UID: \"7cfa1f46-e909-4131-97c9-a8edde9e3a21\") "
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.348137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.348282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rk5r\" (UniqueName: \"kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.348369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.351788 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs" (OuterVolumeSpecName: "logs") pod "7cfa1f46-e909-4131-97c9-a8edde9e3a21" (UID: "7cfa1f46-e909-4131-97c9-a8edde9e3a21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.364126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0"
Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.367783 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x" (OuterVolumeSpecName: "kube-api-access-5wm7x") pod "7cfa1f46-e909-4131-97c9-a8edde9e3a21" (UID: "7cfa1f46-e909-4131-97c9-a8edde9e3a21"). InnerVolumeSpecName "kube-api-access-5wm7x".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.368330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.373180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rk5r\" (UniqueName: \"kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r\") pod \"nova-scheduler-0\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " pod="openstack/nova-scheduler-0" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.388505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cfa1f46-e909-4131-97c9-a8edde9e3a21" (UID: "7cfa1f46-e909-4131-97c9-a8edde9e3a21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.394109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data" (OuterVolumeSpecName: "config-data") pod "7cfa1f46-e909-4131-97c9-a8edde9e3a21" (UID: "7cfa1f46-e909-4131-97c9-a8edde9e3a21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.450927 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cfa1f46-e909-4131-97c9-a8edde9e3a21-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.450957 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wm7x\" (UniqueName: \"kubernetes.io/projected/7cfa1f46-e909-4131-97c9-a8edde9e3a21-kube-api-access-5wm7x\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.450968 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.450979 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfa1f46-e909-4131-97c9-a8edde9e3a21-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.621815 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:17:57 crc kubenswrapper[4743]: I1125 16:17:57.786794 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6" path="/var/lib/kubelet/pods/b1ac2b54-2938-4ec4-b0d3-3ca63d834ff6/volumes" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.072033 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:17:58 crc kubenswrapper[4743]: W1125 16:17:58.074348 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode751348a_d82f_4927_8455_9e2e58468b60.slice/crio-48b4a13a85850b1248f5e3a832b0340ac13e764471a74bcf896e5b2e173f076c WatchSource:0}: Error finding container 48b4a13a85850b1248f5e3a832b0340ac13e764471a74bcf896e5b2e173f076c: Status 404 returned error can't find the container with id 48b4a13a85850b1248f5e3a832b0340ac13e764471a74bcf896e5b2e173f076c Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.130968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cfa1f46-e909-4131-97c9-a8edde9e3a21","Type":"ContainerDied","Data":"1422f9b2b98dfc8015a2073160a6d7df6856909ef06aa1b9c35e2a19c7450940"} Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.131030 4743 scope.go:117] "RemoveContainer" containerID="d9e46b692cb2bd8c6e5be1cdda753ba5eb2d84f88edc02f0d58e44fb852d26d3" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.131186 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.137668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e751348a-d82f-4927-8455-9e2e58468b60","Type":"ContainerStarted","Data":"48b4a13a85850b1248f5e3a832b0340ac13e764471a74bcf896e5b2e173f076c"} Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.296027 4743 scope.go:117] "RemoveContainer" containerID="0a580f72fc8cc629f46a7b419f45c9b4fddca319d2891593b286cab4c8c8248f" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.300304 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.321097 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.337352 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:58 crc kubenswrapper[4743]: E1125 16:17:58.337864 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-api" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.337889 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-api" Nov 25 16:17:58 crc kubenswrapper[4743]: E1125 16:17:58.337925 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-log" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.337934 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-log" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.338169 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-log" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 
16:17:58.338191 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" containerName="nova-api-api" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.339487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: E1125 16:17:58.342198 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod796f4930_dad8_4c02_9ffa_00df9a6689ff.slice/crio-conmon-e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod796f4930_dad8_4c02_9ffa_00df9a6689ff.slice/crio-e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0.scope\": RecentStats: unable to find data in memory cache]" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.344060 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.351052 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.369641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.370191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " 
pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.370288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtgrj\" (UniqueName: \"kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.370316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.472152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtgrj\" (UniqueName: \"kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.472202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.472342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.472366 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.472918 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.476220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.476517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.489851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtgrj\" (UniqueName: \"kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj\") pod \"nova-api-0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") " pod="openstack/nova-api-0" Nov 25 16:17:58 crc kubenswrapper[4743]: I1125 16:17:58.659923 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.103270 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.148618 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e751348a-d82f-4927-8455-9e2e58468b60","Type":"ContainerStarted","Data":"2709003249e749553b1028d1723c6db3b22d7af440ea94e4c2a33fe1d3f0c73e"} Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.150254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerStarted","Data":"b725202e4a6dc3c9474adfaf7b0c84e464c086e7dafbcd684b6b5010bf7438ca"} Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.151425 4743 generic.go:334] "Generic (PLEG): container finished" podID="796f4930-dad8-4c02-9ffa-00df9a6689ff" containerID="e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0" exitCode=0 Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.151462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xssnc" event={"ID":"796f4930-dad8-4c02-9ffa-00df9a6689ff","Type":"ContainerDied","Data":"e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0"} Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.166504 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.166489683 podStartE2EDuration="2.166489683s" podCreationTimestamp="2025-11-25 16:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:17:59.164071726 +0000 UTC m=+1158.285911275" watchObservedRunningTime="2025-11-25 16:17:59.166489683 +0000 UTC m=+1158.288329232" Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.536150 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.536298 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 16:17:59 crc kubenswrapper[4743]: I1125 16:17:59.786157 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cfa1f46-e909-4131-97c9-a8edde9e3a21" path="/var/lib/kubelet/pods/7cfa1f46-e909-4131-97c9-a8edde9e3a21/volumes" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.166332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerStarted","Data":"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"} Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.166883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerStarted","Data":"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"} Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.189548 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.189526047 podStartE2EDuration="2.189526047s" podCreationTimestamp="2025-11-25 16:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:00.182177896 +0000 UTC m=+1159.304017455" watchObservedRunningTime="2025-11-25 16:18:00.189526047 +0000 UTC m=+1159.311365596" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.510694 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.618949 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts\") pod \"796f4930-dad8-4c02-9ffa-00df9a6689ff\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.619138 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2twf6\" (UniqueName: \"kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6\") pod \"796f4930-dad8-4c02-9ffa-00df9a6689ff\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.619181 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle\") pod \"796f4930-dad8-4c02-9ffa-00df9a6689ff\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.619339 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data\") pod \"796f4930-dad8-4c02-9ffa-00df9a6689ff\" (UID: \"796f4930-dad8-4c02-9ffa-00df9a6689ff\") " Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.625036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6" (OuterVolumeSpecName: "kube-api-access-2twf6") pod "796f4930-dad8-4c02-9ffa-00df9a6689ff" (UID: "796f4930-dad8-4c02-9ffa-00df9a6689ff"). InnerVolumeSpecName "kube-api-access-2twf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.625728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts" (OuterVolumeSpecName: "scripts") pod "796f4930-dad8-4c02-9ffa-00df9a6689ff" (UID: "796f4930-dad8-4c02-9ffa-00df9a6689ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.646784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "796f4930-dad8-4c02-9ffa-00df9a6689ff" (UID: "796f4930-dad8-4c02-9ffa-00df9a6689ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.648254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data" (OuterVolumeSpecName: "config-data") pod "796f4930-dad8-4c02-9ffa-00df9a6689ff" (UID: "796f4930-dad8-4c02-9ffa-00df9a6689ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.722492 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2twf6\" (UniqueName: \"kubernetes.io/projected/796f4930-dad8-4c02-9ffa-00df9a6689ff-kube-api-access-2twf6\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.722553 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.722584 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:00 crc kubenswrapper[4743]: I1125 16:18:00.722635 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796f4930-dad8-4c02-9ffa-00df9a6689ff-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.176063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xssnc" event={"ID":"796f4930-dad8-4c02-9ffa-00df9a6689ff","Type":"ContainerDied","Data":"8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68"} Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.176664 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d89dd55b0ae2b53b17b6715cd7a5bdb0f7d66e046549294c818d4d870e49f68" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.176079 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xssnc" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.261770 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 16:18:01 crc kubenswrapper[4743]: E1125 16:18:01.262284 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796f4930-dad8-4c02-9ffa-00df9a6689ff" containerName="nova-cell1-conductor-db-sync" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.262306 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="796f4930-dad8-4c02-9ffa-00df9a6689ff" containerName="nova-cell1-conductor-db-sync" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.262535 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="796f4930-dad8-4c02-9ffa-00df9a6689ff" containerName="nova-cell1-conductor-db-sync" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.263370 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.265515 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.271404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.333049 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.333110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrrgc\" (UniqueName: 
\"kubernetes.io/projected/66fd62a5-dbf6-4ff3-a910-1969f287da86-kube-api-access-zrrgc\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.333189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.434769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.434830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrrgc\" (UniqueName: \"kubernetes.io/projected/66fd62a5-dbf6-4ff3-a910-1969f287da86-kube-api-access-zrrgc\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.434891 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.441080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.445525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66fd62a5-dbf6-4ff3-a910-1969f287da86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.451451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrrgc\" (UniqueName: \"kubernetes.io/projected/66fd62a5-dbf6-4ff3-a910-1969f287da86-kube-api-access-zrrgc\") pod \"nova-cell1-conductor-0\" (UID: \"66fd62a5-dbf6-4ff3-a910-1969f287da86\") " pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:01 crc kubenswrapper[4743]: I1125 16:18:01.579737 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:02 crc kubenswrapper[4743]: I1125 16:18:02.046225 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 25 16:18:02 crc kubenswrapper[4743]: W1125 16:18:02.047884 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66fd62a5_dbf6_4ff3_a910_1969f287da86.slice/crio-099ca889e18476a0d7ab2ece36088639f7184a02de3ce3e3b9e0877b96e92193 WatchSource:0}: Error finding container 099ca889e18476a0d7ab2ece36088639f7184a02de3ce3e3b9e0877b96e92193: Status 404 returned error can't find the container with id 099ca889e18476a0d7ab2ece36088639f7184a02de3ce3e3b9e0877b96e92193 Nov 25 16:18:02 crc kubenswrapper[4743]: I1125 16:18:02.186804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"66fd62a5-dbf6-4ff3-a910-1969f287da86","Type":"ContainerStarted","Data":"099ca889e18476a0d7ab2ece36088639f7184a02de3ce3e3b9e0877b96e92193"} Nov 25 16:18:02 crc kubenswrapper[4743]: I1125 16:18:02.622154 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 16:18:03 crc kubenswrapper[4743]: I1125 16:18:03.198906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"66fd62a5-dbf6-4ff3-a910-1969f287da86","Type":"ContainerStarted","Data":"74db5f33fa030141d609352a8d32d66353e86c650a42f5fc05d804ed509574de"} Nov 25 16:18:03 crc kubenswrapper[4743]: I1125 16:18:03.199189 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:03 crc kubenswrapper[4743]: I1125 16:18:03.222443 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.222420753 podStartE2EDuration="2.222420753s" podCreationTimestamp="2025-11-25 16:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:03.216879019 +0000 UTC m=+1162.338718578" watchObservedRunningTime="2025-11-25 16:18:03.222420753 +0000 UTC m=+1162.344260302" Nov 25 16:18:04 crc kubenswrapper[4743]: I1125 16:18:04.535856 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 16:18:04 crc kubenswrapper[4743]: I1125 16:18:04.536341 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 16:18:05 crc kubenswrapper[4743]: I1125 16:18:05.548852 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:05 crc kubenswrapper[4743]: I1125 16:18:05.548889 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:07 crc kubenswrapper[4743]: I1125 16:18:07.622829 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 16:18:07 crc kubenswrapper[4743]: I1125 16:18:07.650033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 16:18:08 crc kubenswrapper[4743]: I1125 16:18:08.320494 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 16:18:08 crc kubenswrapper[4743]: I1125 16:18:08.660972 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:08 crc kubenswrapper[4743]: I1125 16:18:08.661047 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:09 crc kubenswrapper[4743]: I1125 16:18:09.742767 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:09 crc kubenswrapper[4743]: I1125 16:18:09.742815 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:11 crc kubenswrapper[4743]: I1125 16:18:11.620794 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 25 16:18:14 crc kubenswrapper[4743]: I1125 16:18:14.544824 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 16:18:14 crc kubenswrapper[4743]: I1125 16:18:14.545372 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 16:18:14 crc kubenswrapper[4743]: I1125 16:18:14.553586 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 16:18:14 crc kubenswrapper[4743]: I1125 16:18:14.554007 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.351093 4743 generic.go:334] "Generic (PLEG): container finished" podID="815b7932-2bb0-47c6-a8e2-c182484259c4" containerID="cbed122c43dc1a205f1b74e157100f22782c4010ea152a4d0a3ce8ef12e9f336" exitCode=137 Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.351164 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"815b7932-2bb0-47c6-a8e2-c182484259c4","Type":"ContainerDied","Data":"cbed122c43dc1a205f1b74e157100f22782c4010ea152a4d0a3ce8ef12e9f336"} Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.351571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"815b7932-2bb0-47c6-a8e2-c182484259c4","Type":"ContainerDied","Data":"63454015e6c14c3b25bd9a19a64790aa43f7f7bbd304ff4ceb9cdb2ace22e70e"} Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.351695 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63454015e6c14c3b25bd9a19a64790aa43f7f7bbd304ff4ceb9cdb2ace22e70e" Nov 25 
16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.352499 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.426789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data\") pod \"815b7932-2bb0-47c6-a8e2-c182484259c4\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.427561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle\") pod \"815b7932-2bb0-47c6-a8e2-c182484259c4\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.427758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtjlq\" (UniqueName: \"kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq\") pod \"815b7932-2bb0-47c6-a8e2-c182484259c4\" (UID: \"815b7932-2bb0-47c6-a8e2-c182484259c4\") " Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.433326 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq" (OuterVolumeSpecName: "kube-api-access-mtjlq") pod "815b7932-2bb0-47c6-a8e2-c182484259c4" (UID: "815b7932-2bb0-47c6-a8e2-c182484259c4"). InnerVolumeSpecName "kube-api-access-mtjlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.456773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "815b7932-2bb0-47c6-a8e2-c182484259c4" (UID: "815b7932-2bb0-47c6-a8e2-c182484259c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.475889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data" (OuterVolumeSpecName: "config-data") pod "815b7932-2bb0-47c6-a8e2-c182484259c4" (UID: "815b7932-2bb0-47c6-a8e2-c182484259c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.529870 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.529903 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815b7932-2bb0-47c6-a8e2-c182484259c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:16 crc kubenswrapper[4743]: I1125 16:18:16.529916 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtjlq\" (UniqueName: \"kubernetes.io/projected/815b7932-2bb0-47c6-a8e2-c182484259c4-kube-api-access-mtjlq\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.359148 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.388813 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.396296 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.405342 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:18:17 crc kubenswrapper[4743]: E1125 16:18:17.405796 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815b7932-2bb0-47c6-a8e2-c182484259c4" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.405813 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="815b7932-2bb0-47c6-a8e2-c182484259c4" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.406025 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="815b7932-2bb0-47c6-a8e2-c182484259c4" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.406648 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.409278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.409373 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.409452 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.413402 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.444415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.444705 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.444828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjhr\" (UniqueName: \"kubernetes.io/projected/d229e467-a473-44bf-9f13-73155f796874-kube-api-access-wkjhr\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 
crc kubenswrapper[4743]: I1125 16:18:17.444951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.445030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.547462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.547509 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.547537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkjhr\" (UniqueName: \"kubernetes.io/projected/d229e467-a473-44bf-9f13-73155f796874-kube-api-access-wkjhr\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 
16:18:17.547575 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.547611 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.555182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.555182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.555318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.555692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d229e467-a473-44bf-9f13-73155f796874-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.564647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjhr\" (UniqueName: \"kubernetes.io/projected/d229e467-a473-44bf-9f13-73155f796874-kube-api-access-wkjhr\") pod \"nova-cell1-novncproxy-0\" (UID: \"d229e467-a473-44bf-9f13-73155f796874\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.609361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.722883 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:17 crc kubenswrapper[4743]: I1125 16:18:17.790948 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815b7932-2bb0-47c6-a8e2-c182484259c4" path="/var/lib/kubelet/pods/815b7932-2bb0-47c6-a8e2-c182484259c4/volumes" Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.149674 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 16:18:18 crc kubenswrapper[4743]: W1125 16:18:18.150936 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd229e467_a473_44bf_9f13_73155f796874.slice/crio-42de2b9cd0635ea33c1b061f2efdcde5452fb446a7e0585fb801be2df4ad6c96 WatchSource:0}: Error finding container 42de2b9cd0635ea33c1b061f2efdcde5452fb446a7e0585fb801be2df4ad6c96: Status 404 returned error can't find the container with id 42de2b9cd0635ea33c1b061f2efdcde5452fb446a7e0585fb801be2df4ad6c96 Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.371203 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d229e467-a473-44bf-9f13-73155f796874","Type":"ContainerStarted","Data":"42de2b9cd0635ea33c1b061f2efdcde5452fb446a7e0585fb801be2df4ad6c96"} Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.664603 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.665356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.666891 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 16:18:18 crc kubenswrapper[4743]: I1125 16:18:18.669174 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.385559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d229e467-a473-44bf-9f13-73155f796874","Type":"ContainerStarted","Data":"4485c7a4136d7fd15ca426af0f1ce748e3e23635ff7f026a46a9d1947e0bcd85"} Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.385761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.393826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.411396 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.411365762 podStartE2EDuration="2.411365762s" podCreationTimestamp="2025-11-25 16:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:19.399998275 +0000 UTC 
m=+1178.521837834" watchObservedRunningTime="2025-11-25 16:18:19.411365762 +0000 UTC m=+1178.533205351" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.580186 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.583367 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.678836 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.700828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.701013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tcw\" (UniqueName: \"kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.701076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.701118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.701189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.701210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802500 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tcw\" (UniqueName: \"kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.802760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.803752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.803752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.803977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.804267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.804536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.824967 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tcw\" (UniqueName: \"kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw\") pod \"dnsmasq-dns-89c5cd4d5-kxj8h\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:19 crc kubenswrapper[4743]: I1125 16:18:19.916542 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:20 crc kubenswrapper[4743]: I1125 16:18:20.356852 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:18:20 crc kubenswrapper[4743]: I1125 16:18:20.396610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" event={"ID":"fdc16f4c-0cde-4c16-86b8-44b0cab38e72","Type":"ContainerStarted","Data":"9aeea0fb2b8516cd4b0d4c380b22fd7d6edaa6635b494e301edabef474bb05ce"} Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.183166 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.183801 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-central-agent" containerID="cri-o://4394a5b6d239405f07da4e592c51984b5a9db79804bca4b927be7b1cf4ff194d" gracePeriod=30 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.183918 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="sg-core" containerID="cri-o://fbd38eb8cbd146c06de2de7a954ed041d2de374f3f45e4783bbec0a43f15cc0f" gracePeriod=30 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.184998 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="proxy-httpd" containerID="cri-o://eaa4dc3e96f54f6a0e7f3832e5c71a8fde266fdf0189bdd829aa2cc54c5fabc4" gracePeriod=30 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.185195 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-notification-agent" 
containerID="cri-o://6ec3aff340c586ec6b7971e16093e4df4cba00b97a603250194bbdb47ca6ace4" gracePeriod=30 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.409050 4743 generic.go:334] "Generic (PLEG): container finished" podID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerID="eaa4dc3e96f54f6a0e7f3832e5c71a8fde266fdf0189bdd829aa2cc54c5fabc4" exitCode=0 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.409374 4743 generic.go:334] "Generic (PLEG): container finished" podID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerID="fbd38eb8cbd146c06de2de7a954ed041d2de374f3f45e4783bbec0a43f15cc0f" exitCode=2 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.409133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerDied","Data":"eaa4dc3e96f54f6a0e7f3832e5c71a8fde266fdf0189bdd829aa2cc54c5fabc4"} Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.409453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerDied","Data":"fbd38eb8cbd146c06de2de7a954ed041d2de374f3f45e4783bbec0a43f15cc0f"} Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.411035 4743 generic.go:334] "Generic (PLEG): container finished" podID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerID="b7ba280e66c50c1a41c253255e88bb03f1095c7a5e4cf0efef77357ee3db1bce" exitCode=0 Nov 25 16:18:21 crc kubenswrapper[4743]: I1125 16:18:21.411121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" event={"ID":"fdc16f4c-0cde-4c16-86b8-44b0cab38e72","Type":"ContainerDied","Data":"b7ba280e66c50c1a41c253255e88bb03f1095c7a5e4cf0efef77357ee3db1bce"} Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.047618 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.423524 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" event={"ID":"fdc16f4c-0cde-4c16-86b8-44b0cab38e72","Type":"ContainerStarted","Data":"e3bccbc836f8f554ab576d3a944443744189ea4a19278f3f39b19238bed50ff1"}
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.424897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h"
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.433815 4743 generic.go:334] "Generic (PLEG): container finished" podID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerID="4394a5b6d239405f07da4e592c51984b5a9db79804bca4b927be7b1cf4ff194d" exitCode=0
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.433992 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-log" containerID="cri-o://936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64" gracePeriod=30
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.434197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerDied","Data":"4394a5b6d239405f07da4e592c51984b5a9db79804bca4b927be7b1cf4ff194d"}
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.434275 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-api" containerID="cri-o://21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9" gracePeriod=30
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.447278 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" podStartSLOduration=3.447255572 podStartE2EDuration="3.447255572s" podCreationTimestamp="2025-11-25 16:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:22.444289269 +0000 UTC m=+1181.566128818" watchObservedRunningTime="2025-11-25 16:18:22.447255572 +0000 UTC m=+1181.569095121"
Nov 25 16:18:22 crc kubenswrapper[4743]: I1125 16:18:22.723904 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.446413 4743 generic.go:334] "Generic (PLEG): container finished" podID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerID="6ec3aff340c586ec6b7971e16093e4df4cba00b97a603250194bbdb47ca6ace4" exitCode=0
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.446785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerDied","Data":"6ec3aff340c586ec6b7971e16093e4df4cba00b97a603250194bbdb47ca6ace4"}
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.446810 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53189231-e2dd-4b95-aec8-0f07fec6c495","Type":"ContainerDied","Data":"21243c1d1bff8a2a071c45731e190e8578207e3873a8cc8448839268fc7f6a79"}
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.446821 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21243c1d1bff8a2a071c45731e190e8578207e3873a8cc8448839268fc7f6a79"
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.449616 4743 generic.go:334] "Generic (PLEG): container finished" podID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerID="936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64" exitCode=143
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.450446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerDied","Data":"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"}
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.519199 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.587571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.587668 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.587772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.587820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.588786 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.589093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.589151 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.589171 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7759l\" (UniqueName: \"kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l\") pod \"53189231-e2dd-4b95-aec8-0f07fec6c495\" (UID: \"53189231-e2dd-4b95-aec8-0f07fec6c495\") "
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.589442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.594176 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts" (OuterVolumeSpecName: "scripts") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.595031 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l" (OuterVolumeSpecName: "kube-api-access-7759l") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "kube-api-access-7759l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.596939 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.597311 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.597332 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7759l\" (UniqueName: \"kubernetes.io/projected/53189231-e2dd-4b95-aec8-0f07fec6c495-kube-api-access-7759l\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.597343 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53189231-e2dd-4b95-aec8-0f07fec6c495-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.629862 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.671740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.698751 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.698782 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.721183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data" (OuterVolumeSpecName: "config-data") pod "53189231-e2dd-4b95-aec8-0f07fec6c495" (UID: "53189231-e2dd-4b95-aec8-0f07fec6c495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:23 crc kubenswrapper[4743]: I1125 16:18:23.800734 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53189231-e2dd-4b95-aec8-0f07fec6c495-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.456280 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.479429 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.535613 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.552949 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 25 16:18:24 crc kubenswrapper[4743]: E1125 16:18:24.553398 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-notification-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553419 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-notification-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: E1125 16:18:24.553437 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-central-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553444 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-central-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: E1125 16:18:24.553460 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="sg-core"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553467 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="sg-core"
Nov 25 16:18:24 crc kubenswrapper[4743]: E1125 16:18:24.553502 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="proxy-httpd"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553508 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="proxy-httpd"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553697 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="sg-core"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553727 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-notification-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553739 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="ceilometer-central-agent"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.553750 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" containerName="proxy-httpd"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.555447 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.557928 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.559579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.564436 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621794 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.621910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwgt2\" (UniqueName: \"kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.723784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.723886 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.723924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.723966 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.723988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.724020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.724052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwgt2\" (UniqueName: \"kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.725010 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.725050 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.729507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.729584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.729898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.730041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.746722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwgt2\" (UniqueName: \"kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2\") pod \"ceilometer-0\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " pod="openstack/ceilometer-0"
Nov 25 16:18:24 crc kubenswrapper[4743]: I1125 16:18:24.871994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 25 16:18:25 crc kubenswrapper[4743]: I1125 16:18:25.327757 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 16:18:25 crc kubenswrapper[4743]: W1125 16:18:25.330207 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5561798a_48bc_4ffa_9ce3_0e7278c5f1b9.slice/crio-7c3c925bd8be5a919b6233e1d892b59fd6a5cd55df34bf16a1aa9cff10b312df WatchSource:0}: Error finding container 7c3c925bd8be5a919b6233e1d892b59fd6a5cd55df34bf16a1aa9cff10b312df: Status 404 returned error can't find the container with id 7c3c925bd8be5a919b6233e1d892b59fd6a5cd55df34bf16a1aa9cff10b312df
Nov 25 16:18:25 crc kubenswrapper[4743]: I1125 16:18:25.474036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerStarted","Data":"7c3c925bd8be5a919b6233e1d892b59fd6a5cd55df34bf16a1aa9cff10b312df"}
Nov 25 16:18:25 crc kubenswrapper[4743]: I1125 16:18:25.785747 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53189231-e2dd-4b95-aec8-0f07fec6c495" path="/var/lib/kubelet/pods/53189231-e2dd-4b95-aec8-0f07fec6c495/volumes"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.017419 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.154696 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtgrj\" (UniqueName: \"kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj\") pod \"787e6662-95aa-4bd6-80ce-326cf56c04e0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") "
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.154811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data\") pod \"787e6662-95aa-4bd6-80ce-326cf56c04e0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") "
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.154930 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs\") pod \"787e6662-95aa-4bd6-80ce-326cf56c04e0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") "
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.155016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle\") pod \"787e6662-95aa-4bd6-80ce-326cf56c04e0\" (UID: \"787e6662-95aa-4bd6-80ce-326cf56c04e0\") "
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.155719 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs" (OuterVolumeSpecName: "logs") pod "787e6662-95aa-4bd6-80ce-326cf56c04e0" (UID: "787e6662-95aa-4bd6-80ce-326cf56c04e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.160509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj" (OuterVolumeSpecName: "kube-api-access-wtgrj") pod "787e6662-95aa-4bd6-80ce-326cf56c04e0" (UID: "787e6662-95aa-4bd6-80ce-326cf56c04e0"). InnerVolumeSpecName "kube-api-access-wtgrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.184147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data" (OuterVolumeSpecName: "config-data") pod "787e6662-95aa-4bd6-80ce-326cf56c04e0" (UID: "787e6662-95aa-4bd6-80ce-326cf56c04e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.190104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "787e6662-95aa-4bd6-80ce-326cf56c04e0" (UID: "787e6662-95aa-4bd6-80ce-326cf56c04e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.257467 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/787e6662-95aa-4bd6-80ce-326cf56c04e0-logs\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.257500 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.257511 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtgrj\" (UniqueName: \"kubernetes.io/projected/787e6662-95aa-4bd6-80ce-326cf56c04e0-kube-api-access-wtgrj\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.257521 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787e6662-95aa-4bd6-80ce-326cf56c04e0-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.485734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerStarted","Data":"3afc52f24001ca19c30451080a22a7e1efba0020efd2f70efaf92a98646c0a4c"}
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.488885 4743 generic.go:334] "Generic (PLEG): container finished" podID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerID="21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9" exitCode=0
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.488941 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.488942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerDied","Data":"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"}
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.489085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"787e6662-95aa-4bd6-80ce-326cf56c04e0","Type":"ContainerDied","Data":"b725202e4a6dc3c9474adfaf7b0c84e464c086e7dafbcd684b6b5010bf7438ca"}
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.489110 4743 scope.go:117] "RemoveContainer" containerID="21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.518246 4743 scope.go:117] "RemoveContainer" containerID="936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.523544 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.538216 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.550832 4743 scope.go:117] "RemoveContainer" containerID="21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.552904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 25 16:18:26 crc kubenswrapper[4743]: E1125 16:18:26.553394 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-log"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553413 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-log"
Nov 25 16:18:26 crc kubenswrapper[4743]: E1125 16:18:26.553448 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-api"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553455 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-api"
Nov 25 16:18:26 crc kubenswrapper[4743]: E1125 16:18:26.553469 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9\": container with ID starting with 21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9 not found: ID does not exist" containerID="21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553511 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9"} err="failed to get container status \"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9\": rpc error: code = NotFound desc = could not find container \"21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9\": container with ID starting with 21057c5aff1571da9a4a1f5fc0b3f3879577b95e367dffd106200452222345f9 not found: ID does not exist"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553542 4743 scope.go:117] "RemoveContainer" containerID="936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553648 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-log"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553668 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" containerName="nova-api-api"
Nov 25 16:18:26 crc kubenswrapper[4743]: E1125 16:18:26.553924 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64\": container with ID starting with 936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64 not found: ID does not exist" containerID="936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.553950 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64"} err="failed to get container status \"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64\": rpc error: code = NotFound desc = could not find container \"936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64\": container with ID starting with 936e6951806bc789f5c002a307de2d09b6ebf0076e009ed64b32d3878d94ac64 not found: ID does not exist"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.554578 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.557486 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.557868 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.558269 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.564919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mcb\" (UniqueName: \"kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664454 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.664645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mcb\" (UniqueName: \"kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0"
Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") "
pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.767527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.768123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.772898 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.773418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.777299 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.779248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.790222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mcb\" (UniqueName: \"kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb\") pod \"nova-api-0\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " pod="openstack/nova-api-0" Nov 25 16:18:26 crc kubenswrapper[4743]: I1125 16:18:26.899294 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:18:27 crc kubenswrapper[4743]: I1125 16:18:27.384412 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:27 crc kubenswrapper[4743]: W1125 16:18:27.390945 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda333b8aa_cc8d_4787_a6a5_2387945f5c6b.slice/crio-6842f8dde513f894ce6b78511b293113782645bcb4fd26eaae1393b6982eeef2 WatchSource:0}: Error finding container 6842f8dde513f894ce6b78511b293113782645bcb4fd26eaae1393b6982eeef2: Status 404 returned error can't find the container with id 6842f8dde513f894ce6b78511b293113782645bcb4fd26eaae1393b6982eeef2 Nov 25 16:18:27 crc kubenswrapper[4743]: I1125 16:18:27.499318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerStarted","Data":"6842f8dde513f894ce6b78511b293113782645bcb4fd26eaae1393b6982eeef2"} Nov 25 16:18:27 crc kubenswrapper[4743]: I1125 16:18:27.723950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:27 crc kubenswrapper[4743]: I1125 16:18:27.747920 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:27 crc kubenswrapper[4743]: I1125 16:18:27.835606 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787e6662-95aa-4bd6-80ce-326cf56c04e0" path="/var/lib/kubelet/pods/787e6662-95aa-4bd6-80ce-326cf56c04e0/volumes" Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.516989 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerStarted","Data":"76353db400451396d22794dd7a192d29f3365babf5d19e8e5639a31992b86e4f"} Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 
16:18:28.517440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerStarted","Data":"03957f761983b6b211e150e79cf1bfbb6e680d7eced0483660d84ecf88b4b078"} Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.522651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerStarted","Data":"8aaeac91567ec44ef2dd6659147c4b1ccdf16e44751efc2fd056a6d8d62fd0db"} Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.544469 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.544453061 podStartE2EDuration="2.544453061s" podCreationTimestamp="2025-11-25 16:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:28.540679062 +0000 UTC m=+1187.662518611" watchObservedRunningTime="2025-11-25 16:18:28.544453061 +0000 UTC m=+1187.666292610" Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.547518 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.885280 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q2cg7"] Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.887076 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.895549 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.898269 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q2cg7"] Nov 25 16:18:28 crc kubenswrapper[4743]: I1125 16:18:28.903151 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.049168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsl8r\" (UniqueName: \"kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.049377 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.049615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.050015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.151504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.151633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsl8r\" (UniqueName: \"kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.151667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.151705 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.158529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.175196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.177926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.183193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsl8r\" (UniqueName: \"kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r\") pod \"nova-cell1-cell-mapping-q2cg7\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.390941 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.556789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerStarted","Data":"d6ba2bc1c09d5a805c44a82ab20eec6df2a0291b5185fb6ed6ca855c3f733f22"} Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.881726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q2cg7"] Nov 25 16:18:29 crc kubenswrapper[4743]: W1125 16:18:29.883445 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd28a285_f7b2_4b46_99d5_bf60741558f6.slice/crio-b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb WatchSource:0}: Error finding container b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb: Status 404 returned error can't find the container with id b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.918768 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.995815 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:18:29 crc kubenswrapper[4743]: I1125 16:18:29.996093 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="dnsmasq-dns" containerID="cri-o://7f29edb941d93fb623ce9677aecd6ad754dcf097395097dab9706bfc3e4b1c0c" gracePeriod=10 Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.575502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q2cg7" 
event={"ID":"cd28a285-f7b2-4b46-99d5-bf60741558f6","Type":"ContainerStarted","Data":"2faade60b8f184654a01cfffc8fe45de0a51d1a1d30e824fd896206cd5538749"} Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.578069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q2cg7" event={"ID":"cd28a285-f7b2-4b46-99d5-bf60741558f6","Type":"ContainerStarted","Data":"b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb"} Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.584793 4743 generic.go:334] "Generic (PLEG): container finished" podID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerID="7f29edb941d93fb623ce9677aecd6ad754dcf097395097dab9706bfc3e4b1c0c" exitCode=0 Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.584840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerDied","Data":"7f29edb941d93fb623ce9677aecd6ad754dcf097395097dab9706bfc3e4b1c0c"} Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.605151 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q2cg7" podStartSLOduration=2.605125148 podStartE2EDuration="2.605125148s" podCreationTimestamp="2025-11-25 16:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:30.596382393 +0000 UTC m=+1189.718221932" watchObservedRunningTime="2025-11-25 16:18:30.605125148 +0000 UTC m=+1189.726964697" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.679361 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.804310 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.804668 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.804974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.805322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.805529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.805661 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cz95\" 
(UniqueName: \"kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95\") pod \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\" (UID: \"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d\") " Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.811077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95" (OuterVolumeSpecName: "kube-api-access-7cz95") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "kube-api-access-7cz95". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.857539 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.861793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.864006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.869749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.874694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config" (OuterVolumeSpecName: "config") pod "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" (UID: "5ac2c45f-7d0b-4af7-80a3-e4e0913f330d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908181 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908216 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908227 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908235 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cz95\" (UniqueName: \"kubernetes.io/projected/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-kube-api-access-7cz95\") on node \"crc\" 
DevicePath \"\"" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908246 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:30 crc kubenswrapper[4743]: I1125 16:18:30.908254 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.600841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerStarted","Data":"0bddbc7167c39bdba6e98da9ec5b9bde727dc0ea0462e1a2f650cdbf02d4baf4"} Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.601399 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.604401 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.604562 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" event={"ID":"5ac2c45f-7d0b-4af7-80a3-e4e0913f330d","Type":"ContainerDied","Data":"030b7a595ef60fb2184dff57406fe15d960d830437946d4c91a6f8ea8fe1502b"} Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.604626 4743 scope.go:117] "RemoveContainer" containerID="7f29edb941d93fb623ce9677aecd6ad754dcf097395097dab9706bfc3e4b1c0c" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.628378 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.518889195 podStartE2EDuration="7.628351478s" podCreationTimestamp="2025-11-25 16:18:24 +0000 UTC" firstStartedPulling="2025-11-25 16:18:25.333205705 +0000 UTC m=+1184.455045294" lastFinishedPulling="2025-11-25 16:18:30.442668028 +0000 UTC m=+1189.564507577" observedRunningTime="2025-11-25 16:18:31.622258336 +0000 UTC m=+1190.744097905" watchObservedRunningTime="2025-11-25 16:18:31.628351478 +0000 UTC m=+1190.750191027" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.643951 4743 scope.go:117] "RemoveContainer" containerID="3038fdd4d91301fa79578c6d00782348371e591df33c9757488a890ac3215783" Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.656011 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.665988 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2d76n"] Nov 25 16:18:31 crc kubenswrapper[4743]: I1125 16:18:31.787409 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" path="/var/lib/kubelet/pods/5ac2c45f-7d0b-4af7-80a3-e4e0913f330d/volumes" Nov 25 16:18:35 crc kubenswrapper[4743]: I1125 16:18:35.400211 4743 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-2d76n" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: i/o timeout" Nov 25 16:18:35 crc kubenswrapper[4743]: I1125 16:18:35.651067 4743 generic.go:334] "Generic (PLEG): container finished" podID="cd28a285-f7b2-4b46-99d5-bf60741558f6" containerID="2faade60b8f184654a01cfffc8fe45de0a51d1a1d30e824fd896206cd5538749" exitCode=0 Nov 25 16:18:35 crc kubenswrapper[4743]: I1125 16:18:35.651120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q2cg7" event={"ID":"cd28a285-f7b2-4b46-99d5-bf60741558f6","Type":"ContainerDied","Data":"2faade60b8f184654a01cfffc8fe45de0a51d1a1d30e824fd896206cd5538749"} Nov 25 16:18:36 crc kubenswrapper[4743]: I1125 16:18:36.899980 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:36 crc kubenswrapper[4743]: I1125 16:18:36.900295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.057421 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.151449 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts\") pod \"cd28a285-f7b2-4b46-99d5-bf60741558f6\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.151505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data\") pod \"cd28a285-f7b2-4b46-99d5-bf60741558f6\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.151632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsl8r\" (UniqueName: \"kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r\") pod \"cd28a285-f7b2-4b46-99d5-bf60741558f6\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.151743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle\") pod \"cd28a285-f7b2-4b46-99d5-bf60741558f6\" (UID: \"cd28a285-f7b2-4b46-99d5-bf60741558f6\") " Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.157192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts" (OuterVolumeSpecName: "scripts") pod "cd28a285-f7b2-4b46-99d5-bf60741558f6" (UID: "cd28a285-f7b2-4b46-99d5-bf60741558f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.157961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r" (OuterVolumeSpecName: "kube-api-access-bsl8r") pod "cd28a285-f7b2-4b46-99d5-bf60741558f6" (UID: "cd28a285-f7b2-4b46-99d5-bf60741558f6"). InnerVolumeSpecName "kube-api-access-bsl8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.180550 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd28a285-f7b2-4b46-99d5-bf60741558f6" (UID: "cd28a285-f7b2-4b46-99d5-bf60741558f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.189093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data" (OuterVolumeSpecName: "config-data") pod "cd28a285-f7b2-4b46-99d5-bf60741558f6" (UID: "cd28a285-f7b2-4b46-99d5-bf60741558f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.254392 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.254427 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.254437 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd28a285-f7b2-4b46-99d5-bf60741558f6-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.254446 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsl8r\" (UniqueName: \"kubernetes.io/projected/cd28a285-f7b2-4b46-99d5-bf60741558f6-kube-api-access-bsl8r\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.676583 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q2cg7" event={"ID":"cd28a285-f7b2-4b46-99d5-bf60741558f6","Type":"ContainerDied","Data":"b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb"} Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.676893 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b091c378cc6ce70e81241da5f957c984cc907d31d64101cffad9b2fe41679bfb" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.676809 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q2cg7" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.848257 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.848677 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-log" containerID="cri-o://03957f761983b6b211e150e79cf1bfbb6e680d7eced0483660d84ecf88b4b078" gracePeriod=30 Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.848821 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-api" containerID="cri-o://76353db400451396d22794dd7a192d29f3365babf5d19e8e5639a31992b86e4f" gracePeriod=30 Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.858846 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": EOF" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.858979 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": EOF" Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.879650 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.880021 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e751348a-d82f-4927-8455-9e2e58468b60" containerName="nova-scheduler-scheduler" containerID="cri-o://2709003249e749553b1028d1723c6db3b22d7af440ea94e4c2a33fe1d3f0c73e" gracePeriod=30 Nov 
25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.963506 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.963794 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" containerID="cri-o://33ead0ca9162468c43f78ce7d90b8500bd26be949d79154b77a9e94103ae7de3" gracePeriod=30 Nov 25 16:18:37 crc kubenswrapper[4743]: I1125 16:18:37.964358 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" containerID="cri-o://344b618bfb5b5cc568e87479f8d51655cba2f1f945a544d7f62c20acc79527ab" gracePeriod=30 Nov 25 16:18:38 crc kubenswrapper[4743]: I1125 16:18:38.686242 4743 generic.go:334] "Generic (PLEG): container finished" podID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerID="33ead0ca9162468c43f78ce7d90b8500bd26be949d79154b77a9e94103ae7de3" exitCode=143 Nov 25 16:18:38 crc kubenswrapper[4743]: I1125 16:18:38.686331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerDied","Data":"33ead0ca9162468c43f78ce7d90b8500bd26be949d79154b77a9e94103ae7de3"} Nov 25 16:18:38 crc kubenswrapper[4743]: I1125 16:18:38.688095 4743 generic.go:334] "Generic (PLEG): container finished" podID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerID="03957f761983b6b211e150e79cf1bfbb6e680d7eced0483660d84ecf88b4b078" exitCode=143 Nov 25 16:18:38 crc kubenswrapper[4743]: I1125 16:18:38.688115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerDied","Data":"03957f761983b6b211e150e79cf1bfbb6e680d7eced0483660d84ecf88b4b078"} Nov 25 16:18:41 crc 
kubenswrapper[4743]: I1125 16:18:41.106518 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:42586->10.217.0.190:8775: read: connection reset by peer" Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.106566 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:42588->10.217.0.190:8775: read: connection reset by peer" Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.726627 4743 generic.go:334] "Generic (PLEG): container finished" podID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerID="344b618bfb5b5cc568e87479f8d51655cba2f1f945a544d7f62c20acc79527ab" exitCode=0 Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.727131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerDied","Data":"344b618bfb5b5cc568e87479f8d51655cba2f1f945a544d7f62c20acc79527ab"} Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.729936 4743 generic.go:334] "Generic (PLEG): container finished" podID="e751348a-d82f-4927-8455-9e2e58468b60" containerID="2709003249e749553b1028d1723c6db3b22d7af440ea94e4c2a33fe1d3f0c73e" exitCode=0 Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.730016 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e751348a-d82f-4927-8455-9e2e58468b60","Type":"ContainerDied","Data":"2709003249e749553b1028d1723c6db3b22d7af440ea94e4c2a33fe1d3f0c73e"} Nov 25 16:18:41 crc kubenswrapper[4743]: I1125 16:18:41.925027 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.057286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data\") pod \"e751348a-d82f-4927-8455-9e2e58468b60\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.057339 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle\") pod \"e751348a-d82f-4927-8455-9e2e58468b60\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.057371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rk5r\" (UniqueName: \"kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r\") pod \"e751348a-d82f-4927-8455-9e2e58468b60\" (UID: \"e751348a-d82f-4927-8455-9e2e58468b60\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.065891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r" (OuterVolumeSpecName: "kube-api-access-5rk5r") pod "e751348a-d82f-4927-8455-9e2e58468b60" (UID: "e751348a-d82f-4927-8455-9e2e58468b60"). InnerVolumeSpecName "kube-api-access-5rk5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.067909 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.104707 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e751348a-d82f-4927-8455-9e2e58468b60" (UID: "e751348a-d82f-4927-8455-9e2e58468b60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.116818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data" (OuterVolumeSpecName: "config-data") pod "e751348a-d82f-4927-8455-9e2e58468b60" (UID: "e751348a-d82f-4927-8455-9e2e58468b60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.160636 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.161062 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e751348a-d82f-4927-8455-9e2e58468b60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.161083 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rk5r\" (UniqueName: \"kubernetes.io/projected/e751348a-d82f-4927-8455-9e2e58468b60-kube-api-access-5rk5r\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.261921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs\") pod 
\"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.262018 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data\") pod \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.262047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvjgd\" (UniqueName: \"kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd\") pod \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.262100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs\") pod \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.262153 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle\") pod \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\" (UID: \"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6\") " Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.262748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs" (OuterVolumeSpecName: "logs") pod "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" (UID: "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.266888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd" (OuterVolumeSpecName: "kube-api-access-vvjgd") pod "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" (UID: "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6"). InnerVolumeSpecName "kube-api-access-vvjgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.294759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" (UID: "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.312762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data" (OuterVolumeSpecName: "config-data") pod "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" (UID: "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.333791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" (UID: "c9d26c17-1d0d-4266-87ce-ef8576d9a1e6"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.365268 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.365331 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.365351 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.365372 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.365396 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvjgd\" (UniqueName: \"kubernetes.io/projected/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6-kube-api-access-vvjgd\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.739949 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d26c17-1d0d-4266-87ce-ef8576d9a1e6","Type":"ContainerDied","Data":"deaaf9e8f2a2c8a58491b1dfa091cfb9d621b393209c7639a80c6c719c844a87"} Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.740016 4743 scope.go:117] "RemoveContainer" containerID="344b618bfb5b5cc568e87479f8d51655cba2f1f945a544d7f62c20acc79527ab" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.739970 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.743056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e751348a-d82f-4927-8455-9e2e58468b60","Type":"ContainerDied","Data":"48b4a13a85850b1248f5e3a832b0340ac13e764471a74bcf896e5b2e173f076c"} Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.743091 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.761537 4743 scope.go:117] "RemoveContainer" containerID="33ead0ca9162468c43f78ce7d90b8500bd26be949d79154b77a9e94103ae7de3" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.776333 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.780976 4743 scope.go:117] "RemoveContainer" containerID="2709003249e749553b1028d1723c6db3b22d7af440ea94e4c2a33fe1d3f0c73e" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.785454 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.794714 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.805298 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825244 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825724 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="init" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825746 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="init" Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825758 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e751348a-d82f-4927-8455-9e2e58468b60" containerName="nova-scheduler-scheduler" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825768 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e751348a-d82f-4927-8455-9e2e58468b60" containerName="nova-scheduler-scheduler" Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825827 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd28a285-f7b2-4b46-99d5-bf60741558f6" containerName="nova-manage" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825835 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd28a285-f7b2-4b46-99d5-bf60741558f6" containerName="nova-manage" Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825848 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825855 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" Nov 25 16:18:42 crc kubenswrapper[4743]: E1125 16:18:42.825875 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="dnsmasq-dns" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.825882 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="dnsmasq-dns" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826091 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac2c45f-7d0b-4af7-80a3-e4e0913f330d" containerName="dnsmasq-dns" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826112 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-log" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826129 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" containerName="nova-metadata-metadata" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826150 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd28a285-f7b2-4b46-99d5-bf60741558f6" containerName="nova-manage" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826163 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e751348a-d82f-4927-8455-9e2e58468b60" containerName="nova-scheduler-scheduler" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.826936 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.832657 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.834519 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.840915 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.842882 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.845480 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.845789 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.854458 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.976554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhpr\" (UniqueName: \"kubernetes.io/projected/5f82a83d-3847-490f-b9dd-5dda26140b80-kube-api-access-mkhpr\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.976672 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b78218b-03ac-4dbb-89cf-58580f5367d3-logs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-config-data\") pod \"nova-metadata-0\" (UID: 
\"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-config-data\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4snm\" (UniqueName: \"kubernetes.io/projected/5b78218b-03ac-4dbb-89cf-58580f5367d3-kube-api-access-l4snm\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:42 crc kubenswrapper[4743]: I1125 16:18:42.977578 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " 
pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079379 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhpr\" (UniqueName: \"kubernetes.io/projected/5f82a83d-3847-490f-b9dd-5dda26140b80-kube-api-access-mkhpr\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b78218b-03ac-4dbb-89cf-58580f5367d3-logs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-config-data\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-config-data\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l4snm\" (UniqueName: \"kubernetes.io/projected/5b78218b-03ac-4dbb-89cf-58580f5367d3-kube-api-access-l4snm\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.079544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.080287 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b78218b-03ac-4dbb-89cf-58580f5367d3-logs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.083001 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.083123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.083506 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f82a83d-3847-490f-b9dd-5dda26140b80-config-data\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " 
pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.083758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.084783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b78218b-03ac-4dbb-89cf-58580f5367d3-config-data\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.095690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhpr\" (UniqueName: \"kubernetes.io/projected/5f82a83d-3847-490f-b9dd-5dda26140b80-kube-api-access-mkhpr\") pod \"nova-scheduler-0\" (UID: \"5f82a83d-3847-490f-b9dd-5dda26140b80\") " pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.098902 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4snm\" (UniqueName: \"kubernetes.io/projected/5b78218b-03ac-4dbb-89cf-58580f5367d3-kube-api-access-l4snm\") pod \"nova-metadata-0\" (UID: \"5b78218b-03ac-4dbb-89cf-58580f5367d3\") " pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.158521 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.170855 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.609435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 16:18:43 crc kubenswrapper[4743]: W1125 16:18:43.613873 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f82a83d_3847_490f_b9dd_5dda26140b80.slice/crio-b3ede62f2534119bb43c6db2906d12cf94a0c91432bbd8d5bad6d4f873953617 WatchSource:0}: Error finding container b3ede62f2534119bb43c6db2906d12cf94a0c91432bbd8d5bad6d4f873953617: Status 404 returned error can't find the container with id b3ede62f2534119bb43c6db2906d12cf94a0c91432bbd8d5bad6d4f873953617 Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.681682 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.752751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b78218b-03ac-4dbb-89cf-58580f5367d3","Type":"ContainerStarted","Data":"d1218e05a04262afcd30ecf365483bf2c8f5dae729730bc4c154fb42a9faefa0"} Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.759010 4743 generic.go:334] "Generic (PLEG): container finished" podID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerID="76353db400451396d22794dd7a192d29f3365babf5d19e8e5639a31992b86e4f" exitCode=0 Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.759078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerDied","Data":"76353db400451396d22794dd7a192d29f3365babf5d19e8e5639a31992b86e4f"} Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.761791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5f82a83d-3847-490f-b9dd-5dda26140b80","Type":"ContainerStarted","Data":"b3ede62f2534119bb43c6db2906d12cf94a0c91432bbd8d5bad6d4f873953617"} Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.784898 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d26c17-1d0d-4266-87ce-ef8576d9a1e6" path="/var/lib/kubelet/pods/c9d26c17-1d0d-4266-87ce-ef8576d9a1e6/volumes" Nov 25 16:18:43 crc kubenswrapper[4743]: I1125 16:18:43.785488 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e751348a-d82f-4927-8455-9e2e58468b60" path="/var/lib/kubelet/pods/e751348a-d82f-4927-8455-9e2e58468b60/volumes" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.089448 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200474 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.200675 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mcb\" (UniqueName: \"kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb\") pod \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\" (UID: \"a333b8aa-cc8d-4787-a6a5-2387945f5c6b\") " Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.202875 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs" (OuterVolumeSpecName: "logs") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.206863 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb" (OuterVolumeSpecName: "kube-api-access-78mcb") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "kube-api-access-78mcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.233795 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data" (OuterVolumeSpecName: "config-data") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.235778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.264389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.264835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a333b8aa-cc8d-4787-a6a5-2387945f5c6b" (UID: "a333b8aa-cc8d-4787-a6a5-2387945f5c6b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.302976 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.303025 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mcb\" (UniqueName: \"kubernetes.io/projected/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-kube-api-access-78mcb\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.303041 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.303054 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.303067 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.303078 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b8aa-cc8d-4787-a6a5-2387945f5c6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.772367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b78218b-03ac-4dbb-89cf-58580f5367d3","Type":"ContainerStarted","Data":"07c97438ac9c46662b7e491a3a8dac7cd2cb1d4c17f4034fa63cc0017d9895e7"} Nov 25 16:18:44 crc kubenswrapper[4743]: 
I1125 16:18:44.772934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b78218b-03ac-4dbb-89cf-58580f5367d3","Type":"ContainerStarted","Data":"04d57e0b96d200aba4f0004aff5e1b7ed338b1eb96d04f6093f7dad9436b56b8"} Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.774852 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.774854 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a333b8aa-cc8d-4787-a6a5-2387945f5c6b","Type":"ContainerDied","Data":"6842f8dde513f894ce6b78511b293113782645bcb4fd26eaae1393b6982eeef2"} Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.774906 4743 scope.go:117] "RemoveContainer" containerID="76353db400451396d22794dd7a192d29f3365babf5d19e8e5639a31992b86e4f" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.782167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f82a83d-3847-490f-b9dd-5dda26140b80","Type":"ContainerStarted","Data":"62c11b78e2b5213df64d132035d808b5ca312ac60fdb4ee8103b9c85a25f8cf5"} Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.794267 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.794247129 podStartE2EDuration="2.794247129s" podCreationTimestamp="2025-11-25 16:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:44.79328481 +0000 UTC m=+1203.915124369" watchObservedRunningTime="2025-11-25 16:18:44.794247129 +0000 UTC m=+1203.916086688" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.809368 4743 scope.go:117] "RemoveContainer" containerID="03957f761983b6b211e150e79cf1bfbb6e680d7eced0483660d84ecf88b4b078" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 
16:18:44.810782 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8107628780000002 podStartE2EDuration="2.810762878s" podCreationTimestamp="2025-11-25 16:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:44.805546034 +0000 UTC m=+1203.927385583" watchObservedRunningTime="2025-11-25 16:18:44.810762878 +0000 UTC m=+1203.932602417" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.835310 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.854441 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.864547 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:44 crc kubenswrapper[4743]: E1125 16:18:44.865238 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-log" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.865263 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-log" Nov 25 16:18:44 crc kubenswrapper[4743]: E1125 16:18:44.865324 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-api" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.865334 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-api" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.865793 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-api" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.865835 
4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" containerName="nova-api-log" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.867573 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.869070 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.870248 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.871408 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:44 crc kubenswrapper[4743]: I1125 16:18:44.874465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015224 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzqg\" (UniqueName: \"kubernetes.io/projected/a5aaab81-18e9-41e2-8db4-00c4a09b7710-kube-api-access-kxzqg\") pod 
\"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015259 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaab81-18e9-41e2-8db4-00c4a09b7710-logs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.015349 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-config-data\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116582 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kxzqg\" (UniqueName: \"kubernetes.io/projected/a5aaab81-18e9-41e2-8db4-00c4a09b7710-kube-api-access-kxzqg\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaab81-18e9-41e2-8db4-00c4a09b7710-logs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.116687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-config-data\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.117658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5aaab81-18e9-41e2-8db4-00c4a09b7710-logs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.128230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 
crc kubenswrapper[4743]: I1125 16:18:45.128283 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.128439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-config-data\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.131355 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5aaab81-18e9-41e2-8db4-00c4a09b7710-public-tls-certs\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.131690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzqg\" (UniqueName: \"kubernetes.io/projected/a5aaab81-18e9-41e2-8db4-00c4a09b7710-kube-api-access-kxzqg\") pod \"nova-api-0\" (UID: \"a5aaab81-18e9-41e2-8db4-00c4a09b7710\") " pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.188393 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.636439 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.788408 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a333b8aa-cc8d-4787-a6a5-2387945f5c6b" path="/var/lib/kubelet/pods/a333b8aa-cc8d-4787-a6a5-2387945f5c6b/volumes" Nov 25 16:18:45 crc kubenswrapper[4743]: I1125 16:18:45.793380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5aaab81-18e9-41e2-8db4-00c4a09b7710","Type":"ContainerStarted","Data":"a2af420a9500364a76c80230993fa595684e35b77b9923b7ae80b81180cda900"} Nov 25 16:18:46 crc kubenswrapper[4743]: I1125 16:18:46.810153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5aaab81-18e9-41e2-8db4-00c4a09b7710","Type":"ContainerStarted","Data":"022db6ffea74a2bb64cc1d57d165074b53c1d81f0f28bcf825e60a01e27cc017"} Nov 25 16:18:46 crc kubenswrapper[4743]: I1125 16:18:46.811081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a5aaab81-18e9-41e2-8db4-00c4a09b7710","Type":"ContainerStarted","Data":"fc36e80941b10e24bcfb74727547e5453b77009a983ebef21173f86eeaa98410"} Nov 25 16:18:46 crc kubenswrapper[4743]: I1125 16:18:46.837750 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.837729286 podStartE2EDuration="2.837729286s" podCreationTimestamp="2025-11-25 16:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:18:46.828171947 +0000 UTC m=+1205.950011516" watchObservedRunningTime="2025-11-25 16:18:46.837729286 +0000 UTC m=+1205.959568845" Nov 25 16:18:48 crc kubenswrapper[4743]: I1125 16:18:48.159228 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 16:18:48 crc kubenswrapper[4743]: I1125 16:18:48.171951 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 16:18:48 crc kubenswrapper[4743]: I1125 16:18:48.172029 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 16:18:53 crc kubenswrapper[4743]: I1125 16:18:53.159544 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 16:18:53 crc kubenswrapper[4743]: I1125 16:18:53.171261 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 16:18:53 crc kubenswrapper[4743]: I1125 16:18:53.171302 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 16:18:53 crc kubenswrapper[4743]: I1125 16:18:53.193984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 16:18:54 crc kubenswrapper[4743]: I1125 16:18:54.058010 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 16:18:54 crc kubenswrapper[4743]: I1125 16:18:54.182854 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b78218b-03ac-4dbb-89cf-58580f5367d3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:54 crc kubenswrapper[4743]: I1125 16:18:54.182922 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b78218b-03ac-4dbb-89cf-58580f5367d3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 
25 16:18:54 crc kubenswrapper[4743]: I1125 16:18:54.875553 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 16:18:55 crc kubenswrapper[4743]: I1125 16:18:55.189027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:55 crc kubenswrapper[4743]: I1125 16:18:55.189078 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 16:18:56 crc kubenswrapper[4743]: I1125 16:18:56.206790 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a5aaab81-18e9-41e2-8db4-00c4a09b7710" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:56 crc kubenswrapper[4743]: I1125 16:18:56.206790 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a5aaab81-18e9-41e2-8db4-00c4a09b7710" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 16:18:58 crc kubenswrapper[4743]: I1125 16:18:58.319153 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:58 crc kubenswrapper[4743]: I1125 16:18:58.320223 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" containerName="kube-state-metrics" containerID="cri-o://e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952" gracePeriod=30 Nov 25 16:18:58 crc kubenswrapper[4743]: I1125 16:18:58.742977 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 16:18:58 crc kubenswrapper[4743]: I1125 16:18:58.905047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7mzq\" (UniqueName: \"kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq\") pod \"8337526c-dedb-4e1a-b73e-a9c37c6e6927\" (UID: \"8337526c-dedb-4e1a-b73e-a9c37c6e6927\") " Nov 25 16:18:58 crc kubenswrapper[4743]: I1125 16:18:58.917803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq" (OuterVolumeSpecName: "kube-api-access-w7mzq") pod "8337526c-dedb-4e1a-b73e-a9c37c6e6927" (UID: "8337526c-dedb-4e1a-b73e-a9c37c6e6927"). InnerVolumeSpecName "kube-api-access-w7mzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.008283 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7mzq\" (UniqueName: \"kubernetes.io/projected/8337526c-dedb-4e1a-b73e-a9c37c6e6927-kube-api-access-w7mzq\") on node \"crc\" DevicePath \"\"" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.073467 4743 generic.go:334] "Generic (PLEG): container finished" podID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" containerID="e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952" exitCode=2 Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.073536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8337526c-dedb-4e1a-b73e-a9c37c6e6927","Type":"ContainerDied","Data":"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952"} Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.073608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8337526c-dedb-4e1a-b73e-a9c37c6e6927","Type":"ContainerDied","Data":"0249f6ac0c8a1da11804294780798c2fe79821f4a39d8b7198793565363dba36"} Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.073630 4743 scope.go:117] "RemoveContainer" containerID="e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.073874 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.103279 4743 scope.go:117] "RemoveContainer" containerID="e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952" Nov 25 16:18:59 crc kubenswrapper[4743]: E1125 16:18:59.105042 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952\": container with ID starting with e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952 not found: ID does not exist" containerID="e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.105197 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952"} err="failed to get container status \"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952\": rpc error: code = NotFound desc = could not find container \"e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952\": container with ID starting with e810d6bbf087dfe7d94378e11b5aebf04704edbcafe9dd9e415adda896bf5952 not found: ID does not exist" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.113766 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.124692 4743 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.133384 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: E1125 16:18:59.133875 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" containerName="kube-state-metrics" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.133894 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" containerName="kube-state-metrics" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.134090 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" containerName="kube-state-metrics" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.134924 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.137134 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.139215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.142081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.313097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.313135 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltpf\" (UniqueName: \"kubernetes.io/projected/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-api-access-zltpf\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.313176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.313291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.415245 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.415298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltpf\" (UniqueName: \"kubernetes.io/projected/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-api-access-zltpf\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.415344 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.415387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.419978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.420128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.420581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.446859 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zltpf\" (UniqueName: \"kubernetes.io/projected/aa4a8c5c-3c11-45e5-815e-bebe62e1b165-kube-api-access-zltpf\") pod \"kube-state-metrics-0\" (UID: \"aa4a8c5c-3c11-45e5-815e-bebe62e1b165\") " pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.451650 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.789066 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8337526c-dedb-4e1a-b73e-a9c37c6e6927" path="/var/lib/kubelet/pods/8337526c-dedb-4e1a-b73e-a9c37c6e6927/volumes" Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.884465 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.919315 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.919566 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-central-agent" containerID="cri-o://3afc52f24001ca19c30451080a22a7e1efba0020efd2f70efaf92a98646c0a4c" gracePeriod=30 Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.919657 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="proxy-httpd" containerID="cri-o://0bddbc7167c39bdba6e98da9ec5b9bde727dc0ea0462e1a2f650cdbf02d4baf4" gracePeriod=30 Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.919700 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-notification-agent" 
containerID="cri-o://8aaeac91567ec44ef2dd6659147c4b1ccdf16e44751efc2fd056a6d8d62fd0db" gracePeriod=30 Nov 25 16:18:59 crc kubenswrapper[4743]: I1125 16:18:59.919688 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="sg-core" containerID="cri-o://d6ba2bc1c09d5a805c44a82ab20eec6df2a0291b5185fb6ed6ca855c3f733f22" gracePeriod=30 Nov 25 16:19:00 crc kubenswrapper[4743]: I1125 16:19:00.089458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"aa4a8c5c-3c11-45e5-815e-bebe62e1b165","Type":"ContainerStarted","Data":"a5488d5ee808027a3784683c9d89f9c174b61a4cceebdb414e5ba8ada1ffeea0"} Nov 25 16:19:00 crc kubenswrapper[4743]: I1125 16:19:00.092213 4743 generic.go:334] "Generic (PLEG): container finished" podID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerID="0bddbc7167c39bdba6e98da9ec5b9bde727dc0ea0462e1a2f650cdbf02d4baf4" exitCode=0 Nov 25 16:19:00 crc kubenswrapper[4743]: I1125 16:19:00.092243 4743 generic.go:334] "Generic (PLEG): container finished" podID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerID="d6ba2bc1c09d5a805c44a82ab20eec6df2a0291b5185fb6ed6ca855c3f733f22" exitCode=2 Nov 25 16:19:00 crc kubenswrapper[4743]: I1125 16:19:00.092299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerDied","Data":"0bddbc7167c39bdba6e98da9ec5b9bde727dc0ea0462e1a2f650cdbf02d4baf4"} Nov 25 16:19:00 crc kubenswrapper[4743]: I1125 16:19:00.092324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerDied","Data":"d6ba2bc1c09d5a805c44a82ab20eec6df2a0291b5185fb6ed6ca855c3f733f22"} Nov 25 16:19:01 crc kubenswrapper[4743]: I1125 16:19:01.102455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"aa4a8c5c-3c11-45e5-815e-bebe62e1b165","Type":"ContainerStarted","Data":"cd0ffe2751002a1bf318588bfd0862e7ffc8ae1869fe1067cfaa0427472dfbda"} Nov 25 16:19:01 crc kubenswrapper[4743]: I1125 16:19:01.103672 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 16:19:01 crc kubenswrapper[4743]: I1125 16:19:01.109239 4743 generic.go:334] "Generic (PLEG): container finished" podID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerID="3afc52f24001ca19c30451080a22a7e1efba0020efd2f70efaf92a98646c0a4c" exitCode=0 Nov 25 16:19:01 crc kubenswrapper[4743]: I1125 16:19:01.109299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerDied","Data":"3afc52f24001ca19c30451080a22a7e1efba0020efd2f70efaf92a98646c0a4c"} Nov 25 16:19:01 crc kubenswrapper[4743]: I1125 16:19:01.128780 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7637026900000001 podStartE2EDuration="2.128753779s" podCreationTimestamp="2025-11-25 16:18:59 +0000 UTC" firstStartedPulling="2025-11-25 16:18:59.890153218 +0000 UTC m=+1219.011992767" lastFinishedPulling="2025-11-25 16:19:00.255204307 +0000 UTC m=+1219.377043856" observedRunningTime="2025-11-25 16:19:01.11954497 +0000 UTC m=+1220.241384519" watchObservedRunningTime="2025-11-25 16:19:01.128753779 +0000 UTC m=+1220.250593328" Nov 25 16:19:03 crc kubenswrapper[4743]: I1125 16:19:03.177683 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 16:19:03 crc kubenswrapper[4743]: I1125 16:19:03.180668 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 16:19:03 crc kubenswrapper[4743]: I1125 16:19:03.185960 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.144650 4743 generic.go:334] "Generic (PLEG): container finished" podID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerID="8aaeac91567ec44ef2dd6659147c4b1ccdf16e44751efc2fd056a6d8d62fd0db" exitCode=0 Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.144717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerDied","Data":"8aaeac91567ec44ef2dd6659147c4b1ccdf16e44751efc2fd056a6d8d62fd0db"} Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.151791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.414254 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502573 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502653 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502714 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 
16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502834 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.502932 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwgt2\" (UniqueName: \"kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2\") pod \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\" (UID: \"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9\") " Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.503056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.503307 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.503738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.507803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts" (OuterVolumeSpecName: "scripts") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.507955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2" (OuterVolumeSpecName: "kube-api-access-jwgt2") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "kube-api-access-jwgt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.530164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.604784 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.604818 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.604827 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.604840 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwgt2\" (UniqueName: \"kubernetes.io/projected/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-kube-api-access-jwgt2\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.608367 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.616465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data" (OuterVolumeSpecName: "config-data") pod "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" (UID: "5561798a-48bc-4ffa-9ce3-0e7278c5f1b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.706876 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:04 crc kubenswrapper[4743]: I1125 16:19:04.706907 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.156375 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.156366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5561798a-48bc-4ffa-9ce3-0e7278c5f1b9","Type":"ContainerDied","Data":"7c3c925bd8be5a919b6233e1d892b59fd6a5cd55df34bf16a1aa9cff10b312df"} Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.156442 4743 scope.go:117] "RemoveContainer" containerID="0bddbc7167c39bdba6e98da9ec5b9bde727dc0ea0462e1a2f650cdbf02d4baf4" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.177009 4743 scope.go:117] "RemoveContainer" containerID="d6ba2bc1c09d5a805c44a82ab20eec6df2a0291b5185fb6ed6ca855c3f733f22" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.190653 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.200334 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.207578 4743 scope.go:117] "RemoveContainer" containerID="8aaeac91567ec44ef2dd6659147c4b1ccdf16e44751efc2fd056a6d8d62fd0db" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.209470 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.209934 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.210985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.216265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:19:05 crc kubenswrapper[4743]: E1125 16:19:05.216809 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-notification-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.216878 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-notification-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: E1125 16:19:05.216938 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="proxy-httpd" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.216989 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="proxy-httpd" Nov 25 16:19:05 crc kubenswrapper[4743]: E1125 16:19:05.217055 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-central-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.217117 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-central-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: E1125 16:19:05.217180 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="sg-core" Nov 25 16:19:05 crc kubenswrapper[4743]: 
I1125 16:19:05.217239 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="sg-core" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.220121 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-notification-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.220269 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="sg-core" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.220331 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="proxy-httpd" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.220489 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" containerName="ceilometer-central-agent" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.222990 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.228688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.231129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.231432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.234056 4743 scope.go:117] "RemoveContainer" containerID="3afc52f24001ca19c30451080a22a7e1efba0020efd2f70efaf92a98646c0a4c" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.234434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.234561 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-run-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-scripts\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316753 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-log-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-config-data\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.316798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-299hk\" (UniqueName: \"kubernetes.io/projected/95304982-4885-4344-914e-1a4693b5eed1-kube-api-access-299hk\") pod \"ceilometer-0\" (UID: 
\"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-scripts\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-log-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-config-data\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-299hk\" (UniqueName: \"kubernetes.io/projected/95304982-4885-4344-914e-1a4693b5eed1-kube-api-access-299hk\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.418978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.419032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.419071 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-run-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.419538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-run-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.421033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95304982-4885-4344-914e-1a4693b5eed1-log-httpd\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.425093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc 
kubenswrapper[4743]: I1125 16:19:05.425657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.425857 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.426436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-scripts\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.427103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95304982-4885-4344-914e-1a4693b5eed1-config-data\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.443126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-299hk\" (UniqueName: \"kubernetes.io/projected/95304982-4885-4344-914e-1a4693b5eed1-kube-api-access-299hk\") pod \"ceilometer-0\" (UID: \"95304982-4885-4344-914e-1a4693b5eed1\") " pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.548050 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 16:19:05 crc kubenswrapper[4743]: I1125 16:19:05.808981 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5561798a-48bc-4ffa-9ce3-0e7278c5f1b9" path="/var/lib/kubelet/pods/5561798a-48bc-4ffa-9ce3-0e7278c5f1b9/volumes" Nov 25 16:19:06 crc kubenswrapper[4743]: I1125 16:19:06.000925 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 16:19:06 crc kubenswrapper[4743]: W1125 16:19:06.003771 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95304982_4885_4344_914e_1a4693b5eed1.slice/crio-8d0916705e224a265e4dac20b6583d097cae8844710944dd54dde5f908b2fa89 WatchSource:0}: Error finding container 8d0916705e224a265e4dac20b6583d097cae8844710944dd54dde5f908b2fa89: Status 404 returned error can't find the container with id 8d0916705e224a265e4dac20b6583d097cae8844710944dd54dde5f908b2fa89 Nov 25 16:19:06 crc kubenswrapper[4743]: I1125 16:19:06.008193 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:19:06 crc kubenswrapper[4743]: I1125 16:19:06.165944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95304982-4885-4344-914e-1a4693b5eed1","Type":"ContainerStarted","Data":"8d0916705e224a265e4dac20b6583d097cae8844710944dd54dde5f908b2fa89"} Nov 25 16:19:06 crc kubenswrapper[4743]: I1125 16:19:06.167760 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 16:19:06 crc kubenswrapper[4743]: I1125 16:19:06.173327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 16:19:07 crc kubenswrapper[4743]: I1125 16:19:07.177774 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95304982-4885-4344-914e-1a4693b5eed1","Type":"ContainerStarted","Data":"e67a49e8ecc672c4d86e8ed6f4d0e6c6bb8718c2e37fe4f1318b143cb5693d04"} Nov 25 16:19:08 crc kubenswrapper[4743]: I1125 16:19:08.193630 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95304982-4885-4344-914e-1a4693b5eed1","Type":"ContainerStarted","Data":"48a601d52e3867521b1360c477c457f6811b19590f919e6c394a5fc0e7a0ce6d"} Nov 25 16:19:09 crc kubenswrapper[4743]: I1125 16:19:09.203984 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95304982-4885-4344-914e-1a4693b5eed1","Type":"ContainerStarted","Data":"97283a12f5812a70587914add694693c2e9b31d392c72ff6bb98276b6255f8a4"} Nov 25 16:19:09 crc kubenswrapper[4743]: I1125 16:19:09.461003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 16:19:10 crc kubenswrapper[4743]: I1125 16:19:10.216387 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95304982-4885-4344-914e-1a4693b5eed1","Type":"ContainerStarted","Data":"0d0a14b1665b8072e862550ede014a6a8f490ec17e8a01c2f4b30bfc7200fde2"} Nov 25 16:19:10 crc kubenswrapper[4743]: I1125 16:19:10.216984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 16:19:10 crc kubenswrapper[4743]: I1125 16:19:10.243056 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.413165432 podStartE2EDuration="5.243036166s" podCreationTimestamp="2025-11-25 16:19:05 +0000 UTC" firstStartedPulling="2025-11-25 16:19:06.007959942 +0000 UTC m=+1225.129799491" lastFinishedPulling="2025-11-25 16:19:09.837830676 +0000 UTC m=+1228.959670225" observedRunningTime="2025-11-25 16:19:10.23519683 +0000 UTC m=+1229.357036399" watchObservedRunningTime="2025-11-25 16:19:10.243036166 +0000 UTC m=+1229.364875715" Nov 25 
16:19:35 crc kubenswrapper[4743]: I1125 16:19:35.559437 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 16:19:47 crc kubenswrapper[4743]: I1125 16:19:47.052197 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:19:47 crc kubenswrapper[4743]: I1125 16:19:47.767227 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:19:52 crc kubenswrapper[4743]: I1125 16:19:52.401760 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="rabbitmq" containerID="cri-o://43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584" gracePeriod=604796 Nov 25 16:19:52 crc kubenswrapper[4743]: I1125 16:19:52.466077 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="rabbitmq" containerID="cri-o://ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6" gracePeriod=604795 Nov 25 16:19:54 crc kubenswrapper[4743]: I1125 16:19:54.645370 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Nov 25 16:19:54 crc kubenswrapper[4743]: I1125 16:19:54.904306 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.262493 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.275116 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379456 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: 
\"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379582 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6rdt\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379751 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379809 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379854 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379877 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379953 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.379966 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380013 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qpvg\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380055 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls\") pod \"99b737b1-8d17-4abc-a898-1ceedff80421\" (UID: \"99b737b1-8d17-4abc-a898-1ceedff80421\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380112 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.380132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data\") pod \"32600c5f-46d2-441f-bda1-2ca9e0c35f35\" (UID: 
\"32600c5f-46d2-441f-bda1-2ca9e0c35f35\") " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.382659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.383082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.383469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.385835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.389279 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.391737 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.395440 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.395539 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.395909 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info" (OuterVolumeSpecName: "pod-info") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.399749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg" (OuterVolumeSpecName: "kube-api-access-8qpvg") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "kube-api-access-8qpvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.425937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt" (OuterVolumeSpecName: "kube-api-access-f6rdt") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "kube-api-access-f6rdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.425945 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.428412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.428668 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.451820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info" (OuterVolumeSpecName: "pod-info") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.471313 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.486996 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/99b737b1-8d17-4abc-a898-1ceedff80421-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487046 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487075 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487092 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487106 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qpvg\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-kube-api-access-8qpvg\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487148 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487159 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487168 
4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/32600c5f-46d2-441f-bda1-2ca9e0c35f35-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487180 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487191 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487204 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/99b737b1-8d17-4abc-a898-1ceedff80421-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487215 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/32600c5f-46d2-441f-bda1-2ca9e0c35f35-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487225 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487236 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487248 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6rdt\" (UniqueName: 
\"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-kube-api-access-f6rdt\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.487259 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.548544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data" (OuterVolumeSpecName: "config-data") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.589849 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.590912 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.606425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data" (OuterVolumeSpecName: "config-data") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.644035 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.660651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf" (OuterVolumeSpecName: "server-conf") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.676153 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf" (OuterVolumeSpecName: "server-conf") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.694073 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/99b737b1-8d17-4abc-a898-1ceedff80421-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.694116 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.694130 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.694142 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.694155 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32600c5f-46d2-441f-bda1-2ca9e0c35f35-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.696897 4743 generic.go:334] "Generic (PLEG): container finished" podID="99b737b1-8d17-4abc-a898-1ceedff80421" containerID="ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6" exitCode=0 Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.696994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerDied","Data":"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6"} Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.697029 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-server-0" event={"ID":"99b737b1-8d17-4abc-a898-1ceedff80421","Type":"ContainerDied","Data":"59a57cdd3a977ad2ed968a380c17af996a8977a8a484ea7ed1f32142b594ffc3"} Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.697048 4743 scope.go:117] "RemoveContainer" containerID="ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.697188 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.735095 4743 generic.go:334] "Generic (PLEG): container finished" podID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerID="43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584" exitCode=0 Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.735250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerDied","Data":"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584"} Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.735288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"32600c5f-46d2-441f-bda1-2ca9e0c35f35","Type":"ContainerDied","Data":"b2c07c46b4e775ca1fd0ee1f9e2d73a6f089dede606b1ddfd7232bdfe47a0125"} Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.735435 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.745976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "32600c5f-46d2-441f-bda1-2ca9e0c35f35" (UID: "32600c5f-46d2-441f-bda1-2ca9e0c35f35"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.760309 4743 scope.go:117] "RemoveContainer" containerID="45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.790383 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "99b737b1-8d17-4abc-a898-1ceedff80421" (UID: "99b737b1-8d17-4abc-a898-1ceedff80421"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.797096 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/32600c5f-46d2-441f-bda1-2ca9e0c35f35-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.797173 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/99b737b1-8d17-4abc-a898-1ceedff80421-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.808004 4743 scope.go:117] "RemoveContainer" containerID="ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6" Nov 25 16:19:59 crc kubenswrapper[4743]: E1125 16:19:59.808386 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6\": container with ID starting with ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6 not found: ID does not exist" containerID="ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.808436 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6"} err="failed to get container status \"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6\": rpc error: code = NotFound desc = could not find container \"ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6\": container with ID starting with ad6e3c2cc475c993ab8907a59b69de29a261d63bfc818e982496f6b89ca379c6 not found: ID does not exist" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.808473 4743 scope.go:117] "RemoveContainer" containerID="45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6" Nov 25 16:19:59 crc kubenswrapper[4743]: E1125 16:19:59.809008 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6\": container with ID starting with 45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6 not found: ID does not exist" containerID="45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.809039 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6"} err="failed to get container status \"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6\": rpc error: code = NotFound desc = could not find container \"45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6\": container with ID starting with 45011f800605d00f351de8a2e8909dedde2295779fa6c1754dd8036fcd5511e6 not found: ID does not exist" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.809056 4743 scope.go:117] "RemoveContainer" containerID="43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.913012 4743 scope.go:117] "RemoveContainer" 
containerID="e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.954892 4743 scope.go:117] "RemoveContainer" containerID="43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584" Nov 25 16:19:59 crc kubenswrapper[4743]: E1125 16:19:59.959344 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584\": container with ID starting with 43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584 not found: ID does not exist" containerID="43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.959377 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584"} err="failed to get container status \"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584\": rpc error: code = NotFound desc = could not find container \"43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584\": container with ID starting with 43ab4b9eeb6f340ae519a2c0e63dcb9fe4e8bb8363b26f248c8b28b89464c584 not found: ID does not exist" Nov 25 16:19:59 crc kubenswrapper[4743]: I1125 16:19:59.959399 4743 scope.go:117] "RemoveContainer" containerID="e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9" Nov 25 16:19:59 crc kubenswrapper[4743]: E1125 16:19:59.959718 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9\": container with ID starting with e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9 not found: ID does not exist" containerID="e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9" Nov 25 16:19:59 crc 
kubenswrapper[4743]: I1125 16:19:59.959744 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9"} err="failed to get container status \"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9\": rpc error: code = NotFound desc = could not find container \"e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9\": container with ID starting with e571dd70fc87a20ea9646cd7b5a93fac691cefd57db0cf105731fd3bd0dd22d9 not found: ID does not exist" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.028320 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.039749 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.057841 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: E1125 16:20:00.058212 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058229 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: E1125 16:20:00.058248 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="setup-container" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058258 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="setup-container" Nov 25 16:20:00 crc kubenswrapper[4743]: E1125 16:20:00.058282 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" 
containerName="setup-container" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058288 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="setup-container" Nov 25 16:20:00 crc kubenswrapper[4743]: E1125 16:20:00.058305 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058311 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058609 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.058663 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" containerName="rabbitmq" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.061114 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.063322 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.063519 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.063661 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.063993 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.064251 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2dmzd" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.064400 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.064899 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.085203 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.102933 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.116727 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.129697 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.132505 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139371 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139706 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139785 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139798 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139828 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.139888 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.141411 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zzg9l" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.161424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.206939 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207172 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1337639a-d66d-43cb-a7d9-487f22d1d804-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207455 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-config-data\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txmk\" (UniqueName: 
\"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-kube-api-access-5txmk\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1337639a-d66d-43cb-a7d9-487f22d1d804-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.207894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1337639a-d66d-43cb-a7d9-487f22d1d804-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-config-data\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txmk\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-kube-api-access-5txmk\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chbn\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-kube-api-access-8chbn\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309796 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1337639a-d66d-43cb-a7d9-487f22d1d804-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309946 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.309995 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 
16:20:00.310018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310089 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310316 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310375 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.310993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.311802 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-config-data\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.312887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.313183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.313317 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.313984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1337639a-d66d-43cb-a7d9-487f22d1d804-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.314728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1337639a-d66d-43cb-a7d9-487f22d1d804-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.316658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1337639a-d66d-43cb-a7d9-487f22d1d804-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.326555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.330803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.335224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txmk\" (UniqueName: 
\"kubernetes.io/projected/1337639a-d66d-43cb-a7d9-487f22d1d804-kube-api-access-5txmk\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.352269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"1337639a-d66d-43cb-a7d9-487f22d1d804\") " pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.393123 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412337 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chbn\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-kube-api-access-8chbn\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412579 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.412842 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.414885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.416279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.417377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.418132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.418535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.418865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.419049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.424481 
4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.425170 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.433650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chbn\" (UniqueName: \"kubernetes.io/projected/f54afd9a-9279-4fd3-a14a-6742d1ad9d96-kube-api-access-8chbn\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.449954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f54afd9a-9279-4fd3-a14a-6742d1ad9d96\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.459091 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.872525 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 16:20:00 crc kubenswrapper[4743]: I1125 16:20:00.993293 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 16:20:01 crc kubenswrapper[4743]: I1125 16:20:01.758564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1337639a-d66d-43cb-a7d9-487f22d1d804","Type":"ContainerStarted","Data":"eca53a5fe7e49bd401c6546d3b2c7067ed9f1f365006e7b329964c6a11bd0980"} Nov 25 16:20:01 crc kubenswrapper[4743]: I1125 16:20:01.761114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f54afd9a-9279-4fd3-a14a-6742d1ad9d96","Type":"ContainerStarted","Data":"e195db8c87e8d05ce6585cc5d380600da605a9e63826f10072c4423eafb3af08"} Nov 25 16:20:01 crc kubenswrapper[4743]: I1125 16:20:01.788203 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32600c5f-46d2-441f-bda1-2ca9e0c35f35" path="/var/lib/kubelet/pods/32600c5f-46d2-441f-bda1-2ca9e0c35f35/volumes" Nov 25 16:20:01 crc kubenswrapper[4743]: I1125 16:20:01.789120 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b737b1-8d17-4abc-a898-1ceedff80421" path="/var/lib/kubelet/pods/99b737b1-8d17-4abc-a898-1ceedff80421/volumes" Nov 25 16:20:02 crc kubenswrapper[4743]: I1125 16:20:02.770284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1337639a-d66d-43cb-a7d9-487f22d1d804","Type":"ContainerStarted","Data":"cec9e7d365fb1ae9a3e6054c3ea5e8284eccd40d2189b86a6e4460f8cc231e19"} Nov 25 16:20:02 crc kubenswrapper[4743]: I1125 16:20:02.773009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"f54afd9a-9279-4fd3-a14a-6742d1ad9d96","Type":"ContainerStarted","Data":"e07cd32836f1c4d84a648ddbf4b09453379b6da58eed75bb655695de34a57f78"} Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.166390 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.170974 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.173829 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.184445 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255369 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255714 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.255906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.256008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8ph\" (UniqueName: \"kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358032 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: 
\"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358559 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.358889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.359012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8ph\" (UniqueName: \"kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.359242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.359480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.359529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.359851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc 
kubenswrapper[4743]: I1125 16:20:08.360217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.360250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.380985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8ph\" (UniqueName: \"kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph\") pod \"dnsmasq-dns-79bd4cc8c9-s5fsm\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.489777 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:08 crc kubenswrapper[4743]: I1125 16:20:08.909880 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:09 crc kubenswrapper[4743]: I1125 16:20:09.869984 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerID="61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49" exitCode=0 Nov 25 16:20:09 crc kubenswrapper[4743]: I1125 16:20:09.870138 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" event={"ID":"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc","Type":"ContainerDied","Data":"61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49"} Nov 25 16:20:09 crc kubenswrapper[4743]: I1125 16:20:09.870261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" event={"ID":"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc","Type":"ContainerStarted","Data":"bec1c5390b57b9f4cd9772ba98f6edb005f6fcd880e3b7f2a4898a0355c8d8a2"} Nov 25 16:20:10 crc kubenswrapper[4743]: I1125 16:20:10.880254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" event={"ID":"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc","Type":"ContainerStarted","Data":"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56"} Nov 25 16:20:10 crc kubenswrapper[4743]: I1125 16:20:10.880704 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:10 crc kubenswrapper[4743]: I1125 16:20:10.908215 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" podStartSLOduration=2.90819602 podStartE2EDuration="2.90819602s" podCreationTimestamp="2025-11-25 16:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:20:10.90215075 +0000 UTC m=+1290.023990299" watchObservedRunningTime="2025-11-25 16:20:10.90819602 +0000 UTC m=+1290.030035569" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.491821 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.554001 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.554346 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="dnsmasq-dns" containerID="cri-o://e3bccbc836f8f554ab576d3a944443744189ea4a19278f3f39b19238bed50ff1" gracePeriod=10 Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.690304 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-96jwb"] Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.692385 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.725651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-96jwb"] Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.845836 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-svc\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846486 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846683 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-config\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s885\" (UniqueName: \"kubernetes.io/projected/a587d785-9e96-41ef-95b8-a247f530e971-kube-api-access-2s885\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.846988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-svc\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948799 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-config\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s885\" (UniqueName: \"kubernetes.io/projected/a587d785-9e96-41ef-95b8-a247f530e971-kube-api-access-2s885\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.948972 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.949670 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-svc\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.950421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.950730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.950757 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-config\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.952250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.952487 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a587d785-9e96-41ef-95b8-a247f530e971-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.967164 4743 generic.go:334] "Generic (PLEG): container finished" podID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerID="e3bccbc836f8f554ab576d3a944443744189ea4a19278f3f39b19238bed50ff1" exitCode=0 Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.967203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" event={"ID":"fdc16f4c-0cde-4c16-86b8-44b0cab38e72","Type":"ContainerDied","Data":"e3bccbc836f8f554ab576d3a944443744189ea4a19278f3f39b19238bed50ff1"} Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.967228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" event={"ID":"fdc16f4c-0cde-4c16-86b8-44b0cab38e72","Type":"ContainerDied","Data":"9aeea0fb2b8516cd4b0d4c380b22fd7d6edaa6635b494e301edabef474bb05ce"} Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.967238 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aeea0fb2b8516cd4b0d4c380b22fd7d6edaa6635b494e301edabef474bb05ce" Nov 25 16:20:18 crc kubenswrapper[4743]: I1125 16:20:18.968951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s885\" (UniqueName: \"kubernetes.io/projected/a587d785-9e96-41ef-95b8-a247f530e971-kube-api-access-2s885\") pod \"dnsmasq-dns-55478c4467-96jwb\" (UID: \"a587d785-9e96-41ef-95b8-a247f530e971\") " pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.038496 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.045907 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.151811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5tcw\" (UniqueName: \"kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.151847 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.152061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.152099 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.152120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: 
\"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.152149 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config\") pod \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\" (UID: \"fdc16f4c-0cde-4c16-86b8-44b0cab38e72\") " Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.158840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw" (OuterVolumeSpecName: "kube-api-access-v5tcw") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "kube-api-access-v5tcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.200537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.200643 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.205683 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.211267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config" (OuterVolumeSpecName: "config") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.223206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdc16f4c-0cde-4c16-86b8-44b0cab38e72" (UID: "fdc16f4c-0cde-4c16-86b8-44b0cab38e72"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257234 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257265 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257277 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257288 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257298 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5tcw\" (UniqueName: \"kubernetes.io/projected/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-kube-api-access-v5tcw\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.257309 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdc16f4c-0cde-4c16-86b8-44b0cab38e72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.470616 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-96jwb"] Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.984432 4743 generic.go:334] "Generic (PLEG): container finished" podID="a587d785-9e96-41ef-95b8-a247f530e971" 
containerID="77c81f06099e2b2f0d6a97d2c51541be82399170385443be1593c9e0c3e3cc0b" exitCode=0 Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.984796 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kxj8h" Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.984533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-96jwb" event={"ID":"a587d785-9e96-41ef-95b8-a247f530e971","Type":"ContainerDied","Data":"77c81f06099e2b2f0d6a97d2c51541be82399170385443be1593c9e0c3e3cc0b"} Nov 25 16:20:19 crc kubenswrapper[4743]: I1125 16:20:19.984868 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-96jwb" event={"ID":"a587d785-9e96-41ef-95b8-a247f530e971","Type":"ContainerStarted","Data":"07347459360edfe3cd9ac7b713b8ae5dbda3c45ddc6ecbd9c2059791318036e0"} Nov 25 16:20:20 crc kubenswrapper[4743]: I1125 16:20:20.049670 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:20:20 crc kubenswrapper[4743]: I1125 16:20:20.064904 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kxj8h"] Nov 25 16:20:20 crc kubenswrapper[4743]: I1125 16:20:20.077717 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:20:20 crc kubenswrapper[4743]: I1125 16:20:20.077794 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:20:20 crc 
kubenswrapper[4743]: I1125 16:20:20.997287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-96jwb" event={"ID":"a587d785-9e96-41ef-95b8-a247f530e971","Type":"ContainerStarted","Data":"ffd24e0d303ffe227c358f0afe6ae891ad5dd64a9081572eada7a0bc93b42d57"} Nov 25 16:20:20 crc kubenswrapper[4743]: I1125 16:20:20.997500 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:21 crc kubenswrapper[4743]: I1125 16:20:21.021274 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-96jwb" podStartSLOduration=3.021257205 podStartE2EDuration="3.021257205s" podCreationTimestamp="2025-11-25 16:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:20:21.014126172 +0000 UTC m=+1300.135965741" watchObservedRunningTime="2025-11-25 16:20:21.021257205 +0000 UTC m=+1300.143096744" Nov 25 16:20:21 crc kubenswrapper[4743]: I1125 16:20:21.785898 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" path="/var/lib/kubelet/pods/fdc16f4c-0cde-4c16-86b8-44b0cab38e72/volumes" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.047979 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-96jwb" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.113478 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.114752 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="dnsmasq-dns" containerID="cri-o://e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56" gracePeriod=10 
Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.680274 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.756359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.756535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.757204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.757294 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.757369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 
16:20:29.757468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.757522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8ph\" (UniqueName: \"kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph\") pod \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\" (UID: \"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc\") " Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.769827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph" (OuterVolumeSpecName: "kube-api-access-tw8ph") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "kube-api-access-tw8ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.811244 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.817716 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config" (OuterVolumeSpecName: "config") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.822466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.823234 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.830219 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.837533 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" (UID: "eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864028 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864103 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864122 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8ph\" (UniqueName: \"kubernetes.io/projected/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-kube-api-access-tw8ph\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864138 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864151 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864167 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:29 crc kubenswrapper[4743]: I1125 16:20:29.864179 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.104365 
4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.104384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" event={"ID":"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc","Type":"ContainerDied","Data":"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56"} Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.104458 4743 scope.go:117] "RemoveContainer" containerID="e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.104323 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerID="e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56" exitCode=0 Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.104714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-s5fsm" event={"ID":"eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc","Type":"ContainerDied","Data":"bec1c5390b57b9f4cd9772ba98f6edb005f6fcd880e3b7f2a4898a0355c8d8a2"} Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.125498 4743 scope.go:117] "RemoveContainer" containerID="61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.142332 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.153531 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-s5fsm"] Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.162235 4743 scope.go:117] "RemoveContainer" containerID="e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56" Nov 25 16:20:30 crc kubenswrapper[4743]: E1125 16:20:30.163163 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56\": container with ID starting with e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56 not found: ID does not exist" containerID="e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.163195 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56"} err="failed to get container status \"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56\": rpc error: code = NotFound desc = could not find container \"e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56\": container with ID starting with e5445274210269b8ab5107b5e117b1cb0091ef4683f0e516818aa84614c07b56 not found: ID does not exist" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.163217 4743 scope.go:117] "RemoveContainer" containerID="61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49" Nov 25 16:20:30 crc kubenswrapper[4743]: E1125 16:20:30.163534 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49\": container with ID starting with 61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49 not found: ID does not exist" containerID="61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49" Nov 25 16:20:30 crc kubenswrapper[4743]: I1125 16:20:30.163565 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49"} err="failed to get container status \"61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49\": rpc error: code = NotFound desc = could not find container 
\"61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49\": container with ID starting with 61addee65fa2dd43d2be4c16035a0751b2514d3d7aa988214e34add192cacc49 not found: ID does not exist" Nov 25 16:20:31 crc kubenswrapper[4743]: I1125 16:20:31.784532 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" path="/var/lib/kubelet/pods/eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc/volumes" Nov 25 16:20:35 crc kubenswrapper[4743]: I1125 16:20:35.146780 4743 generic.go:334] "Generic (PLEG): container finished" podID="1337639a-d66d-43cb-a7d9-487f22d1d804" containerID="cec9e7d365fb1ae9a3e6054c3ea5e8284eccd40d2189b86a6e4460f8cc231e19" exitCode=0 Nov 25 16:20:35 crc kubenswrapper[4743]: I1125 16:20:35.146871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1337639a-d66d-43cb-a7d9-487f22d1d804","Type":"ContainerDied","Data":"cec9e7d365fb1ae9a3e6054c3ea5e8284eccd40d2189b86a6e4460f8cc231e19"} Nov 25 16:20:35 crc kubenswrapper[4743]: I1125 16:20:35.149371 4743 generic.go:334] "Generic (PLEG): container finished" podID="f54afd9a-9279-4fd3-a14a-6742d1ad9d96" containerID="e07cd32836f1c4d84a648ddbf4b09453379b6da58eed75bb655695de34a57f78" exitCode=0 Nov 25 16:20:35 crc kubenswrapper[4743]: I1125 16:20:35.149400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f54afd9a-9279-4fd3-a14a-6742d1ad9d96","Type":"ContainerDied","Data":"e07cd32836f1c4d84a648ddbf4b09453379b6da58eed75bb655695de34a57f78"} Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.161305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1337639a-d66d-43cb-a7d9-487f22d1d804","Type":"ContainerStarted","Data":"9295de496eebe41d6618af012c2b20abfe42729b9f45be5d9416961f5b12966b"} Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.162449 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-server-0" Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.163472 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f54afd9a-9279-4fd3-a14a-6742d1ad9d96","Type":"ContainerStarted","Data":"c848db335e4005ad13d541f03b730c3f4f2cc6e2ab213f8c44f452f373cb3931"} Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.163634 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.186362 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.186344436 podStartE2EDuration="36.186344436s" podCreationTimestamp="2025-11-25 16:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:20:36.1848565 +0000 UTC m=+1315.306696059" watchObservedRunningTime="2025-11-25 16:20:36.186344436 +0000 UTC m=+1315.308183985" Nov 25 16:20:36 crc kubenswrapper[4743]: I1125 16:20:36.211203 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.211185738 podStartE2EDuration="36.211185738s" podCreationTimestamp="2025-11-25 16:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:20:36.204336362 +0000 UTC m=+1315.326175921" watchObservedRunningTime="2025-11-25 16:20:36.211185738 +0000 UTC m=+1315.333025287" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.538343 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6"] Nov 25 16:20:42 crc kubenswrapper[4743]: E1125 16:20:42.539413 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539429 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: E1125 16:20:42.539446 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="init" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539454 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="init" Nov 25 16:20:42 crc kubenswrapper[4743]: E1125 16:20:42.539471 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539479 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: E1125 16:20:42.539515 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="init" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539524 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="init" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539741 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8f72f1-f0ac-479e-bebb-6ec10a20dcbc" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.539762 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc16f4c-0cde-4c16-86b8-44b0cab38e72" containerName="dnsmasq-dns" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.540502 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.548136 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.551305 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.551385 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.551468 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.555002 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6"] Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.595359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.595844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvr2b\" (UniqueName: \"kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: 
I1125 16:20:42.596058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.596302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.698200 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvr2b\" (UniqueName: \"kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.698300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.698365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.698414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.705275 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.705282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.705575 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.715279 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvr2b\" (UniqueName: \"kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:42 crc kubenswrapper[4743]: I1125 16:20:42.867576 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:20:43 crc kubenswrapper[4743]: I1125 16:20:43.385173 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6"] Nov 25 16:20:43 crc kubenswrapper[4743]: W1125 16:20:43.386817 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ae17f2e_689f_4dd3_bc91_c52a218a8492.slice/crio-e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f WatchSource:0}: Error finding container e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f: Status 404 returned error can't find the container with id e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f Nov 25 16:20:44 crc kubenswrapper[4743]: I1125 16:20:44.230268 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" event={"ID":"4ae17f2e-689f-4dd3-bc91-c52a218a8492","Type":"ContainerStarted","Data":"e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f"} Nov 25 16:20:50 crc kubenswrapper[4743]: I1125 16:20:50.077816 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 25 16:20:50 crc kubenswrapper[4743]: I1125 16:20:50.078349 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:20:50 crc kubenswrapper[4743]: I1125 16:20:50.397078 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 16:20:50 crc kubenswrapper[4743]: I1125 16:20:50.461854 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 16:20:57 crc kubenswrapper[4743]: I1125 16:20:57.356412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" event={"ID":"4ae17f2e-689f-4dd3-bc91-c52a218a8492","Type":"ContainerStarted","Data":"12b04288297261fc424a00cb64c9bb389e02817136bf43cbe0288e688688ff57"} Nov 25 16:20:57 crc kubenswrapper[4743]: I1125 16:20:57.375808 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" podStartSLOduration=2.734119115 podStartE2EDuration="15.37579162s" podCreationTimestamp="2025-11-25 16:20:42 +0000 UTC" firstStartedPulling="2025-11-25 16:20:43.388699235 +0000 UTC m=+1322.510538784" lastFinishedPulling="2025-11-25 16:20:56.03037174 +0000 UTC m=+1335.152211289" observedRunningTime="2025-11-25 16:20:57.369946056 +0000 UTC m=+1336.491785615" watchObservedRunningTime="2025-11-25 16:20:57.37579162 +0000 UTC m=+1336.497631159" Nov 25 16:21:07 crc kubenswrapper[4743]: I1125 16:21:07.747755 4743 scope.go:117] "RemoveContainer" containerID="3ae3f9b51fe57af31c27665590f2965eae98b618081c1e9cabe7276689de9c1b" Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.077892 4743 
patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.078476 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.078519 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.079234 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.079297 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28" gracePeriod=600 Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.585754 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28" exitCode=0 Nov 25 16:21:20 crc 
kubenswrapper[4743]: I1125 16:21:20.585813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28"} Nov 25 16:21:20 crc kubenswrapper[4743]: I1125 16:21:20.585858 4743 scope.go:117] "RemoveContainer" containerID="22876c3200d1bd282f05d310d56d80b6ce637f5b7335a83f68f3eb1b6ac3ce7a" Nov 25 16:21:21 crc kubenswrapper[4743]: I1125 16:21:21.599472 4743 generic.go:334] "Generic (PLEG): container finished" podID="4ae17f2e-689f-4dd3-bc91-c52a218a8492" containerID="12b04288297261fc424a00cb64c9bb389e02817136bf43cbe0288e688688ff57" exitCode=0 Nov 25 16:21:21 crc kubenswrapper[4743]: I1125 16:21:21.599568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" event={"ID":"4ae17f2e-689f-4dd3-bc91-c52a218a8492","Type":"ContainerDied","Data":"12b04288297261fc424a00cb64c9bb389e02817136bf43cbe0288e688688ff57"} Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.004407 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.162747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key\") pod \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.162848 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory\") pod \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.163030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle\") pod \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.163096 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvr2b\" (UniqueName: \"kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b\") pod \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\" (UID: \"4ae17f2e-689f-4dd3-bc91-c52a218a8492\") " Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.168158 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4ae17f2e-689f-4dd3-bc91-c52a218a8492" (UID: "4ae17f2e-689f-4dd3-bc91-c52a218a8492"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.183146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b" (OuterVolumeSpecName: "kube-api-access-nvr2b") pod "4ae17f2e-689f-4dd3-bc91-c52a218a8492" (UID: "4ae17f2e-689f-4dd3-bc91-c52a218a8492"). InnerVolumeSpecName "kube-api-access-nvr2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.198750 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4ae17f2e-689f-4dd3-bc91-c52a218a8492" (UID: "4ae17f2e-689f-4dd3-bc91-c52a218a8492"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.206423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory" (OuterVolumeSpecName: "inventory") pod "4ae17f2e-689f-4dd3-bc91-c52a218a8492" (UID: "4ae17f2e-689f-4dd3-bc91-c52a218a8492"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.265653 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.265685 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvr2b\" (UniqueName: \"kubernetes.io/projected/4ae17f2e-689f-4dd3-bc91-c52a218a8492-kube-api-access-nvr2b\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.265695 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.265703 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ae17f2e-689f-4dd3-bc91-c52a218a8492-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.624767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" event={"ID":"4ae17f2e-689f-4dd3-bc91-c52a218a8492","Type":"ContainerDied","Data":"e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f"} Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.624823 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.624831 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e52b9f698db6ab50e7a837d92570b0605559fea1f46dbb6e3cf81de348e2791f" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.712278 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh"] Nov 25 16:21:23 crc kubenswrapper[4743]: E1125 16:21:23.712799 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae17f2e-689f-4dd3-bc91-c52a218a8492" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.712828 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae17f2e-689f-4dd3-bc91-c52a218a8492" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.713119 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae17f2e-689f-4dd3-bc91-c52a218a8492" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.713891 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.717235 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.717303 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.717789 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.719466 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.729063 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh"] Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.877626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.877793 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.877924 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx756\" (UniqueName: \"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.980126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.980230 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx756\" (UniqueName: \"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.980309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.986229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.986443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:23 crc kubenswrapper[4743]: I1125 16:21:23.996835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx756\" (UniqueName: \"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fdkdh\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:24 crc kubenswrapper[4743]: I1125 16:21:24.034341 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:24 crc kubenswrapper[4743]: I1125 16:21:24.534538 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh"] Nov 25 16:21:24 crc kubenswrapper[4743]: I1125 16:21:24.633649 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" event={"ID":"36ce5802-7073-425c-bd4e-1b770cfacd49","Type":"ContainerStarted","Data":"ab31ea5bd8dcbea7fc180e16fcf522cbd1b9e8e0e19cf6c52b957f4155843e8e"} Nov 25 16:21:39 crc kubenswrapper[4743]: I1125 16:21:39.789547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806"} Nov 25 16:21:41 crc kubenswrapper[4743]: I1125 16:21:41.825864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" event={"ID":"36ce5802-7073-425c-bd4e-1b770cfacd49","Type":"ContainerStarted","Data":"d62640d12ef2653e19adf125710bebbad6ce72b643807eb39290cd26ddd32896"} Nov 25 16:21:42 crc kubenswrapper[4743]: I1125 16:21:42.857684 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" podStartSLOduration=3.824996133 podStartE2EDuration="19.857664376s" podCreationTimestamp="2025-11-25 16:21:23 +0000 UTC" firstStartedPulling="2025-11-25 16:21:24.540027667 +0000 UTC m=+1363.661867216" lastFinishedPulling="2025-11-25 16:21:40.57269588 +0000 UTC m=+1379.694535459" observedRunningTime="2025-11-25 16:21:42.855662052 +0000 UTC m=+1381.977501601" watchObservedRunningTime="2025-11-25 16:21:42.857664376 +0000 UTC m=+1381.979503935" Nov 25 16:21:44 crc kubenswrapper[4743]: I1125 16:21:44.854348 4743 
generic.go:334] "Generic (PLEG): container finished" podID="36ce5802-7073-425c-bd4e-1b770cfacd49" containerID="d62640d12ef2653e19adf125710bebbad6ce72b643807eb39290cd26ddd32896" exitCode=0 Nov 25 16:21:44 crc kubenswrapper[4743]: I1125 16:21:44.854411 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" event={"ID":"36ce5802-7073-425c-bd4e-1b770cfacd49","Type":"ContainerDied","Data":"d62640d12ef2653e19adf125710bebbad6ce72b643807eb39290cd26ddd32896"} Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.213240 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.304744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory\") pod \"36ce5802-7073-425c-bd4e-1b770cfacd49\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.304931 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key\") pod \"36ce5802-7073-425c-bd4e-1b770cfacd49\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.305245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx756\" (UniqueName: \"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756\") pod \"36ce5802-7073-425c-bd4e-1b770cfacd49\" (UID: \"36ce5802-7073-425c-bd4e-1b770cfacd49\") " Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.310194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756" (OuterVolumeSpecName: "kube-api-access-tx756") pod "36ce5802-7073-425c-bd4e-1b770cfacd49" (UID: "36ce5802-7073-425c-bd4e-1b770cfacd49"). InnerVolumeSpecName "kube-api-access-tx756". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.332417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory" (OuterVolumeSpecName: "inventory") pod "36ce5802-7073-425c-bd4e-1b770cfacd49" (UID: "36ce5802-7073-425c-bd4e-1b770cfacd49"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.332583 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36ce5802-7073-425c-bd4e-1b770cfacd49" (UID: "36ce5802-7073-425c-bd4e-1b770cfacd49"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.407654 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx756\" (UniqueName: \"kubernetes.io/projected/36ce5802-7073-425c-bd4e-1b770cfacd49-kube-api-access-tx756\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.407693 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.407707 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ce5802-7073-425c-bd4e-1b770cfacd49-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.878795 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" event={"ID":"36ce5802-7073-425c-bd4e-1b770cfacd49","Type":"ContainerDied","Data":"ab31ea5bd8dcbea7fc180e16fcf522cbd1b9e8e0e19cf6c52b957f4155843e8e"} Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.878846 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab31ea5bd8dcbea7fc180e16fcf522cbd1b9e8e0e19cf6c52b957f4155843e8e" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.879172 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fdkdh" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.955583 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd"] Nov 25 16:21:46 crc kubenswrapper[4743]: E1125 16:21:46.956020 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ce5802-7073-425c-bd4e-1b770cfacd49" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.956040 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ce5802-7073-425c-bd4e-1b770cfacd49" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.956255 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ce5802-7073-425c-bd4e-1b770cfacd49" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.956962 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.961878 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.962077 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.962145 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.962153 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:21:46 crc kubenswrapper[4743]: I1125 16:21:46.967001 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd"] Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.120942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.121009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.121271 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.121350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbg8\" (UniqueName: \"kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.223008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.223073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbg8\" (UniqueName: \"kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.223166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.223218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.227494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.228498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.232064 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.240079 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-wzbg8\" (UniqueName: \"kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.274142 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.830985 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd"] Nov 25 16:21:47 crc kubenswrapper[4743]: W1125 16:21:47.835832 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14bc3c31_f23e_4c67_a989_e85613bd5607.slice/crio-f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703 WatchSource:0}: Error finding container f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703: Status 404 returned error can't find the container with id f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703 Nov 25 16:21:47 crc kubenswrapper[4743]: I1125 16:21:47.891999 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" event={"ID":"14bc3c31-f23e-4c67-a989-e85613bd5607","Type":"ContainerStarted","Data":"f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703"} Nov 25 16:21:49 crc kubenswrapper[4743]: I1125 16:21:49.910817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" event={"ID":"14bc3c31-f23e-4c67-a989-e85613bd5607","Type":"ContainerStarted","Data":"789c3db1cc79db74b88f6c1ac0c172cdc776cd75634f6b75119abf8658c2fc00"} Nov 25 16:21:49 crc kubenswrapper[4743]: I1125 16:21:49.936178 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" podStartSLOduration=2.554777128 podStartE2EDuration="3.936157573s" podCreationTimestamp="2025-11-25 16:21:46 +0000 UTC" firstStartedPulling="2025-11-25 16:21:47.838609429 +0000 UTC m=+1386.960448968" lastFinishedPulling="2025-11-25 16:21:49.219989864 +0000 UTC m=+1388.341829413" observedRunningTime="2025-11-25 16:21:49.923162484 +0000 UTC m=+1389.045002053" watchObservedRunningTime="2025-11-25 16:21:49.936157573 +0000 UTC m=+1389.057997122" Nov 25 16:22:07 crc kubenswrapper[4743]: I1125 16:22:07.822365 4743 scope.go:117] "RemoveContainer" containerID="4e14ec9661e427d9de11a7735eb9aa7c07fc4a0622749c8d48ae6dcc47377e33" Nov 25 16:22:07 crc kubenswrapper[4743]: I1125 16:22:07.883040 4743 scope.go:117] "RemoveContainer" containerID="4f0da2d223788415cd89ad3d4f0a8e3ec1c0c924891e4d67b0c6c839f9bb917b" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.320018 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.324315 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.335656 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.415580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfdq\" (UniqueName: \"kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.415684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.415720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.517630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfdq\" (UniqueName: \"kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.518192 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.518309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.518743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.518868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.536771 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfdq\" (UniqueName: \"kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq\") pod \"community-operators-n6lxw\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:38 crc kubenswrapper[4743]: I1125 16:22:38.643073 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:39 crc kubenswrapper[4743]: I1125 16:22:39.146642 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:39 crc kubenswrapper[4743]: I1125 16:22:39.398993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerStarted","Data":"653a0031f6c9d17a980a92fc7ad106046928ec87c0b6f09cb63e619a1c113f23"} Nov 25 16:22:40 crc kubenswrapper[4743]: I1125 16:22:40.409240 4743 generic.go:334] "Generic (PLEG): container finished" podID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerID="8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba" exitCode=0 Nov 25 16:22:40 crc kubenswrapper[4743]: I1125 16:22:40.409469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerDied","Data":"8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba"} Nov 25 16:22:41 crc kubenswrapper[4743]: I1125 16:22:41.425252 4743 generic.go:334] "Generic (PLEG): container finished" podID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerID="e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a" exitCode=0 Nov 25 16:22:41 crc kubenswrapper[4743]: I1125 16:22:41.425331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerDied","Data":"e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a"} Nov 25 16:22:42 crc kubenswrapper[4743]: I1125 16:22:42.437234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" 
event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerStarted","Data":"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec"} Nov 25 16:22:42 crc kubenswrapper[4743]: I1125 16:22:42.457235 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6lxw" podStartSLOduration=2.707346275 podStartE2EDuration="4.457218622s" podCreationTimestamp="2025-11-25 16:22:38 +0000 UTC" firstStartedPulling="2025-11-25 16:22:40.411024315 +0000 UTC m=+1439.532863864" lastFinishedPulling="2025-11-25 16:22:42.160896642 +0000 UTC m=+1441.282736211" observedRunningTime="2025-11-25 16:22:42.45429466 +0000 UTC m=+1441.576134209" watchObservedRunningTime="2025-11-25 16:22:42.457218622 +0000 UTC m=+1441.579058161" Nov 25 16:22:48 crc kubenswrapper[4743]: I1125 16:22:48.644042 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:48 crc kubenswrapper[4743]: I1125 16:22:48.644610 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:48 crc kubenswrapper[4743]: I1125 16:22:48.688251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:49 crc kubenswrapper[4743]: I1125 16:22:49.547015 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:49 crc kubenswrapper[4743]: I1125 16:22:49.601454 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:51 crc kubenswrapper[4743]: I1125 16:22:51.522416 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6lxw" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="registry-server" 
containerID="cri-o://d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec" gracePeriod=2 Nov 25 16:22:51 crc kubenswrapper[4743]: I1125 16:22:51.971076 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.074528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqfdq\" (UniqueName: \"kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq\") pod \"617dd55b-2dac-48c1-9133-add0b8cc8a83\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.074638 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities\") pod \"617dd55b-2dac-48c1-9133-add0b8cc8a83\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.075106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content\") pod \"617dd55b-2dac-48c1-9133-add0b8cc8a83\" (UID: \"617dd55b-2dac-48c1-9133-add0b8cc8a83\") " Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.075505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities" (OuterVolumeSpecName: "utilities") pod "617dd55b-2dac-48c1-9133-add0b8cc8a83" (UID: "617dd55b-2dac-48c1-9133-add0b8cc8a83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.076008 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.082481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq" (OuterVolumeSpecName: "kube-api-access-vqfdq") pod "617dd55b-2dac-48c1-9133-add0b8cc8a83" (UID: "617dd55b-2dac-48c1-9133-add0b8cc8a83"). InnerVolumeSpecName "kube-api-access-vqfdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.156088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "617dd55b-2dac-48c1-9133-add0b8cc8a83" (UID: "617dd55b-2dac-48c1-9133-add0b8cc8a83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.177159 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617dd55b-2dac-48c1-9133-add0b8cc8a83-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.177195 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqfdq\" (UniqueName: \"kubernetes.io/projected/617dd55b-2dac-48c1-9133-add0b8cc8a83-kube-api-access-vqfdq\") on node \"crc\" DevicePath \"\"" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.536343 4743 generic.go:334] "Generic (PLEG): container finished" podID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerID="d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec" exitCode=0 Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.536444 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6lxw" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.536465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerDied","Data":"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec"} Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.536571 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6lxw" event={"ID":"617dd55b-2dac-48c1-9133-add0b8cc8a83","Type":"ContainerDied","Data":"653a0031f6c9d17a980a92fc7ad106046928ec87c0b6f09cb63e619a1c113f23"} Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.536615 4743 scope.go:117] "RemoveContainer" containerID="d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.588316 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.592539 4743 scope.go:117] "RemoveContainer" containerID="e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.599994 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6lxw"] Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.624709 4743 scope.go:117] "RemoveContainer" containerID="8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.657682 4743 scope.go:117] "RemoveContainer" containerID="d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec" Nov 25 16:22:52 crc kubenswrapper[4743]: E1125 16:22:52.658338 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec\": container with ID starting with d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec not found: ID does not exist" containerID="d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.658391 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec"} err="failed to get container status \"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec\": rpc error: code = NotFound desc = could not find container \"d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec\": container with ID starting with d19eda967ca55c4ee844fc3bb11644c6a2884d262d1d6ffb8e3b8b91804820ec not found: ID does not exist" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.658429 4743 scope.go:117] "RemoveContainer" 
containerID="e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a" Nov 25 16:22:52 crc kubenswrapper[4743]: E1125 16:22:52.658907 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a\": container with ID starting with e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a not found: ID does not exist" containerID="e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.658947 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a"} err="failed to get container status \"e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a\": rpc error: code = NotFound desc = could not find container \"e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a\": container with ID starting with e3e4ee4a794071a6fa73b87672b3d57cd666630ccf2d1590d053bca93dca024a not found: ID does not exist" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.658975 4743 scope.go:117] "RemoveContainer" containerID="8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba" Nov 25 16:22:52 crc kubenswrapper[4743]: E1125 16:22:52.659489 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba\": container with ID starting with 8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba not found: ID does not exist" containerID="8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba" Nov 25 16:22:52 crc kubenswrapper[4743]: I1125 16:22:52.659526 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba"} err="failed to get container status \"8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba\": rpc error: code = NotFound desc = could not find container \"8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba\": container with ID starting with 8d6888d3dd4c794deab90e873d1f85f1474a373d1f3ca83162eb8a03dad584ba not found: ID does not exist" Nov 25 16:22:53 crc kubenswrapper[4743]: I1125 16:22:53.786527 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" path="/var/lib/kubelet/pods/617dd55b-2dac-48c1-9133-add0b8cc8a83/volumes" Nov 25 16:23:07 crc kubenswrapper[4743]: I1125 16:23:07.959656 4743 scope.go:117] "RemoveContainer" containerID="77ca5aae3db4e70f2c1b6e0b7d9ad1512008f38e516e6c097311a0c55044b50a" Nov 25 16:23:07 crc kubenswrapper[4743]: I1125 16:23:07.996016 4743 scope.go:117] "RemoveContainer" containerID="c53caca7d07a07f9d55d1062d2c96b6a0742a755f21faeb5ed134845cf98bfc3" Nov 25 16:23:08 crc kubenswrapper[4743]: I1125 16:23:08.024332 4743 scope.go:117] "RemoveContainer" containerID="1270cefdd71d45d8ff275f0666d11f7b39999fb327c62c89be28b6a053603b41" Nov 25 16:23:08 crc kubenswrapper[4743]: I1125 16:23:08.059077 4743 scope.go:117] "RemoveContainer" containerID="e07b1cb797bd7b8e627e801949d1ca9f975054d787e04f2628340ec9a250ce50" Nov 25 16:23:08 crc kubenswrapper[4743]: I1125 16:23:08.125866 4743 scope.go:117] "RemoveContainer" containerID="93ef641d94e456b042a83b541b33e5a7648530f145123f3b9410d769eb45ed36" Nov 25 16:23:50 crc kubenswrapper[4743]: I1125 16:23:50.077258 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:23:50 crc 
kubenswrapper[4743]: I1125 16:23:50.077857 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.035604 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:00 crc kubenswrapper[4743]: E1125 16:24:00.036379 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="registry-server" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.036390 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="registry-server" Nov 25 16:24:00 crc kubenswrapper[4743]: E1125 16:24:00.036404 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="extract-content" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.036410 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="extract-content" Nov 25 16:24:00 crc kubenswrapper[4743]: E1125 16:24:00.036424 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="extract-utilities" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.036430 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="extract-utilities" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.036688 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="617dd55b-2dac-48c1-9133-add0b8cc8a83" containerName="registry-server" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 
16:24:00.038082 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.054015 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.171437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862s4\" (UniqueName: \"kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.171680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.171728 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.273191 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862s4\" (UniqueName: \"kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc 
kubenswrapper[4743]: I1125 16:24:00.273390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.273449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.274070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.274131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.294538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862s4\" (UniqueName: \"kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4\") pod \"certified-operators-zg2sz\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.363717 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:00 crc kubenswrapper[4743]: I1125 16:24:00.838307 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:01 crc kubenswrapper[4743]: I1125 16:24:01.157841 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerID="40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec" exitCode=0 Nov 25 16:24:01 crc kubenswrapper[4743]: I1125 16:24:01.157911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerDied","Data":"40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec"} Nov 25 16:24:01 crc kubenswrapper[4743]: I1125 16:24:01.157940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerStarted","Data":"a437a47a6a6b328a912ac33f23c7fbc3257925dff83b61afbf1b20c6364f26b9"} Nov 25 16:24:02 crc kubenswrapper[4743]: I1125 16:24:02.169561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerStarted","Data":"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f"} Nov 25 16:24:03 crc kubenswrapper[4743]: I1125 16:24:03.181969 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerID="6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f" exitCode=0 Nov 25 16:24:03 crc kubenswrapper[4743]: I1125 16:24:03.182013 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" 
event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerDied","Data":"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f"} Nov 25 16:24:04 crc kubenswrapper[4743]: I1125 16:24:04.192783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerStarted","Data":"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940"} Nov 25 16:24:04 crc kubenswrapper[4743]: I1125 16:24:04.216726 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zg2sz" podStartSLOduration=1.735651894 podStartE2EDuration="4.216705014s" podCreationTimestamp="2025-11-25 16:24:00 +0000 UTC" firstStartedPulling="2025-11-25 16:24:01.160199996 +0000 UTC m=+1520.282039545" lastFinishedPulling="2025-11-25 16:24:03.641253116 +0000 UTC m=+1522.763092665" observedRunningTime="2025-11-25 16:24:04.210670385 +0000 UTC m=+1523.332509954" watchObservedRunningTime="2025-11-25 16:24:04.216705014 +0000 UTC m=+1523.338544583" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.280830 4743 scope.go:117] "RemoveContainer" containerID="eaa4dc3e96f54f6a0e7f3832e5c71a8fde266fdf0189bdd829aa2cc54c5fabc4" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.301905 4743 scope.go:117] "RemoveContainer" containerID="cbed122c43dc1a205f1b74e157100f22782c4010ea152a4d0a3ce8ef12e9f336" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.327624 4743 scope.go:117] "RemoveContainer" containerID="fbd38eb8cbd146c06de2de7a954ed041d2de374f3f45e4783bbec0a43f15cc0f" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.366891 4743 scope.go:117] "RemoveContainer" containerID="6ec3aff340c586ec6b7971e16093e4df4cba00b97a603250194bbdb47ca6ace4" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.397177 4743 scope.go:117] "RemoveContainer" 
containerID="ea7b9050d28f76913ed959c917818132aef1c9c135892f603456594f85a036b1" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.434923 4743 scope.go:117] "RemoveContainer" containerID="7dd8c8a412d18cdf20ff28ebfe5b7e4f6ca76cb737624361b6f7bed227d68ccf" Nov 25 16:24:08 crc kubenswrapper[4743]: I1125 16:24:08.458473 4743 scope.go:117] "RemoveContainer" containerID="4394a5b6d239405f07da4e592c51984b5a9db79804bca4b927be7b1cf4ff194d" Nov 25 16:24:10 crc kubenswrapper[4743]: I1125 16:24:10.364687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:10 crc kubenswrapper[4743]: I1125 16:24:10.365081 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:10 crc kubenswrapper[4743]: I1125 16:24:10.426345 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:11 crc kubenswrapper[4743]: I1125 16:24:11.300481 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:11 crc kubenswrapper[4743]: I1125 16:24:11.348036 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.278015 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zg2sz" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="registry-server" containerID="cri-o://41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940" gracePeriod=2 Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.676947 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.679321 4743 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.692937 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.803897 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.823300 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwcx\" (UniqueName: \"kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.823410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.823626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.924890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities\") pod \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\" 
(UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.925135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content\") pod \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.925174 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-862s4\" (UniqueName: \"kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4\") pod \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\" (UID: \"b8e2f79b-99c9-4038-b11e-edd5346d6e04\") " Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.925509 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.925623 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwcx\" (UniqueName: \"kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.925649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 
16:24:13.925993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities" (OuterVolumeSpecName: "utilities") pod "b8e2f79b-99c9-4038-b11e-edd5346d6e04" (UID: "b8e2f79b-99c9-4038-b11e-edd5346d6e04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.926480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.926827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.931432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4" (OuterVolumeSpecName: "kube-api-access-862s4") pod "b8e2f79b-99c9-4038-b11e-edd5346d6e04" (UID: "b8e2f79b-99c9-4038-b11e-edd5346d6e04"). InnerVolumeSpecName "kube-api-access-862s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.947775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwcx\" (UniqueName: \"kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx\") pod \"redhat-operators-k5bkd\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:13 crc kubenswrapper[4743]: I1125 16:24:13.976637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8e2f79b-99c9-4038-b11e-edd5346d6e04" (UID: "b8e2f79b-99c9-4038-b11e-edd5346d6e04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.027459 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.027497 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8e2f79b-99c9-4038-b11e-edd5346d6e04-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.027509 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-862s4\" (UniqueName: \"kubernetes.io/projected/b8e2f79b-99c9-4038-b11e-edd5346d6e04-kube-api-access-862s4\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.095542 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.293224 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerID="41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940" exitCode=0 Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.293294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerDied","Data":"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940"} Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.293353 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zg2sz" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.293390 4743 scope.go:117] "RemoveContainer" containerID="41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.293375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zg2sz" event={"ID":"b8e2f79b-99c9-4038-b11e-edd5346d6e04","Type":"ContainerDied","Data":"a437a47a6a6b328a912ac33f23c7fbc3257925dff83b61afbf1b20c6364f26b9"} Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.334125 4743 scope.go:117] "RemoveContainer" containerID="6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.354888 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.369049 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zg2sz"] Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.372186 4743 scope.go:117] "RemoveContainer" 
containerID="40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.396792 4743 scope.go:117] "RemoveContainer" containerID="41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940" Nov 25 16:24:14 crc kubenswrapper[4743]: E1125 16:24:14.397334 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940\": container with ID starting with 41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940 not found: ID does not exist" containerID="41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.397419 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940"} err="failed to get container status \"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940\": rpc error: code = NotFound desc = could not find container \"41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940\": container with ID starting with 41f9935719062ab432684e0fade8852a859b2f9c578daa243609a60c29ed6940 not found: ID does not exist" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.397465 4743 scope.go:117] "RemoveContainer" containerID="6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f" Nov 25 16:24:14 crc kubenswrapper[4743]: E1125 16:24:14.398613 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f\": container with ID starting with 6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f not found: ID does not exist" containerID="6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f" Nov 25 16:24:14 crc 
kubenswrapper[4743]: I1125 16:24:14.398662 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f"} err="failed to get container status \"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f\": rpc error: code = NotFound desc = could not find container \"6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f\": container with ID starting with 6442b529240ad4b211af45e8bc039f809128d1b99b3bfc8937678d4aefca3b0f not found: ID does not exist" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.398700 4743 scope.go:117] "RemoveContainer" containerID="40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec" Nov 25 16:24:14 crc kubenswrapper[4743]: E1125 16:24:14.399283 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec\": container with ID starting with 40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec not found: ID does not exist" containerID="40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.399315 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec"} err="failed to get container status \"40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec\": rpc error: code = NotFound desc = could not find container \"40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec\": container with ID starting with 40d10de5a37fea0e358f304e56e5f3c8169258c4bb631a40db401668105ecaec not found: ID does not exist" Nov 25 16:24:14 crc kubenswrapper[4743]: I1125 16:24:14.549194 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:15 crc 
kubenswrapper[4743]: I1125 16:24:15.308536 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf2a305a-f814-486c-9657-817ed4761c29" containerID="a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa" exitCode=0 Nov 25 16:24:15 crc kubenswrapper[4743]: I1125 16:24:15.308655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerDied","Data":"a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa"} Nov 25 16:24:15 crc kubenswrapper[4743]: I1125 16:24:15.309072 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerStarted","Data":"d40b0b6ce2d1c7a05d4deca57d56ecf8beaba167069bc0da23feba665c4da7c4"} Nov 25 16:24:15 crc kubenswrapper[4743]: I1125 16:24:15.311659 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:24:15 crc kubenswrapper[4743]: I1125 16:24:15.786042 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" path="/var/lib/kubelet/pods/b8e2f79b-99c9-4038-b11e-edd5346d6e04/volumes" Nov 25 16:24:17 crc kubenswrapper[4743]: I1125 16:24:17.351149 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf2a305a-f814-486c-9657-817ed4761c29" containerID="e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e" exitCode=0 Nov 25 16:24:17 crc kubenswrapper[4743]: I1125 16:24:17.351594 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerDied","Data":"e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e"} Nov 25 16:24:18 crc kubenswrapper[4743]: I1125 16:24:18.363224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerStarted","Data":"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40"} Nov 25 16:24:18 crc kubenswrapper[4743]: I1125 16:24:18.384234 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k5bkd" podStartSLOduration=2.688973458 podStartE2EDuration="5.384217241s" podCreationTimestamp="2025-11-25 16:24:13 +0000 UTC" firstStartedPulling="2025-11-25 16:24:15.311218304 +0000 UTC m=+1534.433057883" lastFinishedPulling="2025-11-25 16:24:18.006462117 +0000 UTC m=+1537.128301666" observedRunningTime="2025-11-25 16:24:18.382089655 +0000 UTC m=+1537.503929224" watchObservedRunningTime="2025-11-25 16:24:18.384217241 +0000 UTC m=+1537.506056790" Nov 25 16:24:20 crc kubenswrapper[4743]: I1125 16:24:20.076997 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:24:20 crc kubenswrapper[4743]: I1125 16:24:20.077349 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:24:24 crc kubenswrapper[4743]: I1125 16:24:24.096242 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:24 crc kubenswrapper[4743]: I1125 16:24:24.096939 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:24 crc kubenswrapper[4743]: 
I1125 16:24:24.145306 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:24 crc kubenswrapper[4743]: I1125 16:24:24.487867 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:24 crc kubenswrapper[4743]: I1125 16:24:24.535758 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:26 crc kubenswrapper[4743]: I1125 16:24:26.459043 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k5bkd" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="registry-server" containerID="cri-o://77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40" gracePeriod=2 Nov 25 16:24:26 crc kubenswrapper[4743]: I1125 16:24:26.925021 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.074082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwcx\" (UniqueName: \"kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx\") pod \"cf2a305a-f814-486c-9657-817ed4761c29\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.074789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities\") pod \"cf2a305a-f814-486c-9657-817ed4761c29\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.074842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content\") pod \"cf2a305a-f814-486c-9657-817ed4761c29\" (UID: \"cf2a305a-f814-486c-9657-817ed4761c29\") " Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.075510 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities" (OuterVolumeSpecName: "utilities") pod "cf2a305a-f814-486c-9657-817ed4761c29" (UID: "cf2a305a-f814-486c-9657-817ed4761c29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.080192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx" (OuterVolumeSpecName: "kube-api-access-gzwcx") pod "cf2a305a-f814-486c-9657-817ed4761c29" (UID: "cf2a305a-f814-486c-9657-817ed4761c29"). InnerVolumeSpecName "kube-api-access-gzwcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.145122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf2a305a-f814-486c-9657-817ed4761c29" (UID: "cf2a305a-f814-486c-9657-817ed4761c29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.177436 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.177531 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf2a305a-f814-486c-9657-817ed4761c29-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.177550 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwcx\" (UniqueName: \"kubernetes.io/projected/cf2a305a-f814-486c-9657-817ed4761c29-kube-api-access-gzwcx\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.472810 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf2a305a-f814-486c-9657-817ed4761c29" containerID="77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40" exitCode=0 Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.472859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerDied","Data":"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40"} Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.472890 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5bkd" event={"ID":"cf2a305a-f814-486c-9657-817ed4761c29","Type":"ContainerDied","Data":"d40b0b6ce2d1c7a05d4deca57d56ecf8beaba167069bc0da23feba665c4da7c4"} Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.472894 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5bkd" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.472909 4743 scope.go:117] "RemoveContainer" containerID="77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.501944 4743 scope.go:117] "RemoveContainer" containerID="e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.505547 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.516757 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k5bkd"] Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.526480 4743 scope.go:117] "RemoveContainer" containerID="a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.563170 4743 scope.go:117] "RemoveContainer" containerID="77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40" Nov 25 16:24:27 crc kubenswrapper[4743]: E1125 16:24:27.564319 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40\": container with ID starting with 77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40 not found: ID does not exist" containerID="77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.564350 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40"} err="failed to get container status \"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40\": rpc error: code = NotFound desc = could not find container 
\"77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40\": container with ID starting with 77fdbee299e3fa3c15b77086c7aa0c23170e7a063114a2a415df6e5b46979a40 not found: ID does not exist" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.564370 4743 scope.go:117] "RemoveContainer" containerID="e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e" Nov 25 16:24:27 crc kubenswrapper[4743]: E1125 16:24:27.564565 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e\": container with ID starting with e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e not found: ID does not exist" containerID="e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.564643 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e"} err="failed to get container status \"e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e\": rpc error: code = NotFound desc = could not find container \"e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e\": container with ID starting with e716b4a9ff040f05a2f51a23367c8313cd1478cd954f6d871ceb27561b82038e not found: ID does not exist" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.564660 4743 scope.go:117] "RemoveContainer" containerID="a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa" Nov 25 16:24:27 crc kubenswrapper[4743]: E1125 16:24:27.565039 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa\": container with ID starting with a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa not found: ID does not exist" 
containerID="a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.565087 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa"} err="failed to get container status \"a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa\": rpc error: code = NotFound desc = could not find container \"a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa\": container with ID starting with a301541da7d7f0699bbbc2091acca8157198fd595614cb371e0fefcc884bc1aa not found: ID does not exist" Nov 25 16:24:27 crc kubenswrapper[4743]: E1125 16:24:27.669076 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2a305a_f814_486c_9657_817ed4761c29.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2a305a_f814_486c_9657_817ed4761c29.slice/crio-d40b0b6ce2d1c7a05d4deca57d56ecf8beaba167069bc0da23feba665c4da7c4\": RecentStats: unable to find data in memory cache]" Nov 25 16:24:27 crc kubenswrapper[4743]: I1125 16:24:27.786411 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2a305a-f814-486c-9657-817ed4761c29" path="/var/lib/kubelet/pods/cf2a305a-f814-486c-9657-817ed4761c29/volumes" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.909912 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.910918 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="extract-utilities" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.910932 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="extract-utilities" Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.910955 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.910961 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.910980 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.910986 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.911008 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="extract-content" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.911014 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="extract-content" Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.911029 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="extract-content" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.911037 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="extract-content" Nov 25 16:24:43 crc kubenswrapper[4743]: E1125 16:24:43.911046 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="extract-utilities" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.911052 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="extract-utilities" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.911238 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8e2f79b-99c9-4038-b11e-edd5346d6e04" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.911260 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2a305a-f814-486c-9657-817ed4761c29" containerName="registry-server" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.912654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.925165 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.984085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d59f\" (UniqueName: \"kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.984419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:43 crc kubenswrapper[4743]: I1125 16:24:43.984523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities\") pod \"redhat-marketplace-njpm9\" (UID: 
\"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.087359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.087434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.087470 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d59f\" (UniqueName: \"kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.088260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.088533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " 
pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.105506 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d59f\" (UniqueName: \"kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f\") pod \"redhat-marketplace-njpm9\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.231191 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:44 crc kubenswrapper[4743]: I1125 16:24:44.706494 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:45 crc kubenswrapper[4743]: I1125 16:24:45.628797 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerID="6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d" exitCode=0 Nov 25 16:24:45 crc kubenswrapper[4743]: I1125 16:24:45.628883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerDied","Data":"6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d"} Nov 25 16:24:45 crc kubenswrapper[4743]: I1125 16:24:45.629126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerStarted","Data":"7fc64355dbd97cab31d05be3ef46d24b085e88c5216e1add5ed0816148aaede6"} Nov 25 16:24:47 crc kubenswrapper[4743]: I1125 16:24:47.647470 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerID="85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c" exitCode=0 Nov 25 16:24:47 crc 
kubenswrapper[4743]: I1125 16:24:47.647529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerDied","Data":"85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c"} Nov 25 16:24:48 crc kubenswrapper[4743]: I1125 16:24:48.659748 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerStarted","Data":"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e"} Nov 25 16:24:48 crc kubenswrapper[4743]: I1125 16:24:48.688940 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-njpm9" podStartSLOduration=3.25450853 podStartE2EDuration="5.688923052s" podCreationTimestamp="2025-11-25 16:24:43 +0000 UTC" firstStartedPulling="2025-11-25 16:24:45.630699579 +0000 UTC m=+1564.752539128" lastFinishedPulling="2025-11-25 16:24:48.065114101 +0000 UTC m=+1567.186953650" observedRunningTime="2025-11-25 16:24:48.678793254 +0000 UTC m=+1567.800632823" watchObservedRunningTime="2025-11-25 16:24:48.688923052 +0000 UTC m=+1567.810762601" Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.077113 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.077214 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.077315 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.078758 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.078852 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" gracePeriod=600 Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.722322 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" exitCode=0 Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.722715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806"} Nov 25 16:24:50 crc kubenswrapper[4743]: I1125 16:24:50.722780 4743 scope.go:117] "RemoveContainer" containerID="c7a1e69a5b625582ce315759353f5fdddaa5af76ffaa857f59e8fe101fdc1d28" Nov 25 16:24:50 crc kubenswrapper[4743]: E1125 16:24:50.724415 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:24:51 crc kubenswrapper[4743]: I1125 16:24:51.731496 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:24:51 crc kubenswrapper[4743]: E1125 16:24:51.731761 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:24:54 crc kubenswrapper[4743]: I1125 16:24:54.231933 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:54 crc kubenswrapper[4743]: I1125 16:24:54.232330 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:54 crc kubenswrapper[4743]: I1125 16:24:54.285251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:54 crc kubenswrapper[4743]: I1125 16:24:54.803546 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:54 crc kubenswrapper[4743]: I1125 16:24:54.855865 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:56 crc kubenswrapper[4743]: I1125 16:24:56.779209 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-njpm9" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="registry-server" containerID="cri-o://8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e" gracePeriod=2 Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.239797 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.339290 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d59f\" (UniqueName: \"kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f\") pod \"2ea4677e-446d-48a1-8bf7-4affbd261da0\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.339499 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content\") pod \"2ea4677e-446d-48a1-8bf7-4affbd261da0\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.339643 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities\") pod \"2ea4677e-446d-48a1-8bf7-4affbd261da0\" (UID: \"2ea4677e-446d-48a1-8bf7-4affbd261da0\") " Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.340832 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities" (OuterVolumeSpecName: "utilities") pod "2ea4677e-446d-48a1-8bf7-4affbd261da0" (UID: "2ea4677e-446d-48a1-8bf7-4affbd261da0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.346509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f" (OuterVolumeSpecName: "kube-api-access-9d59f") pod "2ea4677e-446d-48a1-8bf7-4affbd261da0" (UID: "2ea4677e-446d-48a1-8bf7-4affbd261da0"). InnerVolumeSpecName "kube-api-access-9d59f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.359511 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea4677e-446d-48a1-8bf7-4affbd261da0" (UID: "2ea4677e-446d-48a1-8bf7-4affbd261da0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.442465 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.442517 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea4677e-446d-48a1-8bf7-4affbd261da0-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.442532 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d59f\" (UniqueName: \"kubernetes.io/projected/2ea4677e-446d-48a1-8bf7-4affbd261da0-kube-api-access-9d59f\") on node \"crc\" DevicePath \"\"" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.790664 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ea4677e-446d-48a1-8bf7-4affbd261da0" 
containerID="8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e" exitCode=0 Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.790724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerDied","Data":"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e"} Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.790760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njpm9" event={"ID":"2ea4677e-446d-48a1-8bf7-4affbd261da0","Type":"ContainerDied","Data":"7fc64355dbd97cab31d05be3ef46d24b085e88c5216e1add5ed0816148aaede6"} Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.790781 4743 scope.go:117] "RemoveContainer" containerID="8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.790933 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njpm9" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.817249 4743 scope.go:117] "RemoveContainer" containerID="85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.831622 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.844376 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-njpm9"] Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.852909 4743 scope.go:117] "RemoveContainer" containerID="6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.896501 4743 scope.go:117] "RemoveContainer" containerID="8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e" Nov 25 16:24:57 crc kubenswrapper[4743]: E1125 16:24:57.897120 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e\": container with ID starting with 8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e not found: ID does not exist" containerID="8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.897175 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e"} err="failed to get container status \"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e\": rpc error: code = NotFound desc = could not find container \"8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e\": container with ID starting with 8efcfd500524eae94ec876eff99e1f37c4edd8dbd31e7298b494b36fe2146c4e not found: 
ID does not exist" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.897208 4743 scope.go:117] "RemoveContainer" containerID="85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c" Nov 25 16:24:57 crc kubenswrapper[4743]: E1125 16:24:57.897634 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c\": container with ID starting with 85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c not found: ID does not exist" containerID="85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.897674 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c"} err="failed to get container status \"85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c\": rpc error: code = NotFound desc = could not find container \"85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c\": container with ID starting with 85fc9bb215bb783b99b32f94b2d75788da2b907465c9c865a0a527ca3074135c not found: ID does not exist" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.897705 4743 scope.go:117] "RemoveContainer" containerID="6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d" Nov 25 16:24:57 crc kubenswrapper[4743]: E1125 16:24:57.898181 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d\": container with ID starting with 6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d not found: ID does not exist" containerID="6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d" Nov 25 16:24:57 crc kubenswrapper[4743]: I1125 16:24:57.898226 4743 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d"} err="failed to get container status \"6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d\": rpc error: code = NotFound desc = could not find container \"6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d\": container with ID starting with 6b31c4905647e29405cc696daf87b452f135d1778db0617cef609ad4b904b88d not found: ID does not exist" Nov 25 16:24:59 crc kubenswrapper[4743]: I1125 16:24:59.785997 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" path="/var/lib/kubelet/pods/2ea4677e-446d-48a1-8bf7-4affbd261da0/volumes" Nov 25 16:25:01 crc kubenswrapper[4743]: I1125 16:25:01.037860 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2wn4j"] Nov 25 16:25:01 crc kubenswrapper[4743]: I1125 16:25:01.048385 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2wn4j"] Nov 25 16:25:01 crc kubenswrapper[4743]: I1125 16:25:01.790108 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ceca1a3-ffa3-4b15-bca5-c306720941f3" path="/var/lib/kubelet/pods/1ceca1a3-ffa3-4b15-bca5-c306720941f3/volumes" Nov 25 16:25:02 crc kubenswrapper[4743]: I1125 16:25:02.026901 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f4d9-account-create-pmmxn"] Nov 25 16:25:02 crc kubenswrapper[4743]: I1125 16:25:02.038465 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f4d9-account-create-pmmxn"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.030374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0782-account-create-k5q7l"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.041332 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0782-account-create-k5q7l"] Nov 
25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.054134 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-aa4a-account-create-mlttg"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.064095 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jtddz"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.073772 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2j7qx"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.082103 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jtddz"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.089697 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-aa4a-account-create-mlttg"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.103634 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2j7qx"] Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.788494 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a21ef5d-71ae-47ce-aa24-d49830887317" path="/var/lib/kubelet/pods/0a21ef5d-71ae-47ce-aa24-d49830887317/volumes" Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.789297 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436fe732-731a-4cb7-85d6-2d4e3ef48805" path="/var/lib/kubelet/pods/436fe732-731a-4cb7-85d6-2d4e3ef48805/volumes" Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.790062 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5903c1f3-91d8-4b0a-9680-d0f547df08c2" path="/var/lib/kubelet/pods/5903c1f3-91d8-4b0a-9680-d0f547df08c2/volumes" Nov 25 16:25:03 crc kubenswrapper[4743]: I1125 16:25:03.790896 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ae1e82-cea4-4ab3-9cc7-7b2615288871" path="/var/lib/kubelet/pods/f1ae1e82-cea4-4ab3-9cc7-7b2615288871/volumes" Nov 25 16:25:03 crc 
kubenswrapper[4743]: I1125 16:25:03.792333 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f" path="/var/lib/kubelet/pods/f32d225e-47dc-4e1f-9fd9-34b8d4c15f5f/volumes" Nov 25 16:25:06 crc kubenswrapper[4743]: I1125 16:25:06.775720 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:25:06 crc kubenswrapper[4743]: E1125 16:25:06.776537 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.542891 4743 scope.go:117] "RemoveContainer" containerID="e3bccbc836f8f554ab576d3a944443744189ea4a19278f3f39b19238bed50ff1" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.579226 4743 scope.go:117] "RemoveContainer" containerID="45f5ef28aebece070382f1342dc3d4c003a2f4b5752bd7119ec51a0625709e56" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.610242 4743 scope.go:117] "RemoveContainer" containerID="5d7ec6230ac2f515462dce23f77de838c2c0e0ebdc003696878e7871d27bcb44" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.668424 4743 scope.go:117] "RemoveContainer" containerID="b7ba280e66c50c1a41c253255e88bb03f1095c7a5e4cf0efef77357ee3db1bce" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.736178 4743 scope.go:117] "RemoveContainer" containerID="3911a14700937f91794271ed381f84a89e399c3898b17ee85e44c780807dbf2f" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.785117 4743 scope.go:117] "RemoveContainer" containerID="27df768c416be8f775a14d1b8637d6485231ade77f7b2d1674ce086c86fd22be" Nov 25 16:25:08 crc 
kubenswrapper[4743]: I1125 16:25:08.842135 4743 scope.go:117] "RemoveContainer" containerID="9590cbdbe696c1fc0327f0b4b653113088098e0bd7decd969a63930ee20f2078" Nov 25 16:25:08 crc kubenswrapper[4743]: I1125 16:25:08.884844 4743 scope.go:117] "RemoveContainer" containerID="b4044e626635467d25ee0ce321b3dc9d51cd2b8647513b802b8a5b2ba62fdfcf" Nov 25 16:25:17 crc kubenswrapper[4743]: I1125 16:25:17.983296 4743 generic.go:334] "Generic (PLEG): container finished" podID="14bc3c31-f23e-4c67-a989-e85613bd5607" containerID="789c3db1cc79db74b88f6c1ac0c172cdc776cd75634f6b75119abf8658c2fc00" exitCode=0 Nov 25 16:25:17 crc kubenswrapper[4743]: I1125 16:25:17.983389 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" event={"ID":"14bc3c31-f23e-4c67-a989-e85613bd5607","Type":"ContainerDied","Data":"789c3db1cc79db74b88f6c1ac0c172cdc776cd75634f6b75119abf8658c2fc00"} Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.378482 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.467126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory\") pod \"14bc3c31-f23e-4c67-a989-e85613bd5607\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.467565 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbg8\" (UniqueName: \"kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8\") pod \"14bc3c31-f23e-4c67-a989-e85613bd5607\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.467703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle\") pod \"14bc3c31-f23e-4c67-a989-e85613bd5607\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.467787 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") pod \"14bc3c31-f23e-4c67-a989-e85613bd5607\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.473024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "14bc3c31-f23e-4c67-a989-e85613bd5607" (UID: "14bc3c31-f23e-4c67-a989-e85613bd5607"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.473215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8" (OuterVolumeSpecName: "kube-api-access-wzbg8") pod "14bc3c31-f23e-4c67-a989-e85613bd5607" (UID: "14bc3c31-f23e-4c67-a989-e85613bd5607"). InnerVolumeSpecName "kube-api-access-wzbg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:25:19 crc kubenswrapper[4743]: E1125 16:25:19.490994 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key podName:14bc3c31-f23e-4c67-a989-e85613bd5607 nodeName:}" failed. No retries permitted until 2025-11-25 16:25:19.990961332 +0000 UTC m=+1599.112800881 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key") pod "14bc3c31-f23e-4c67-a989-e85613bd5607" (UID: "14bc3c31-f23e-4c67-a989-e85613bd5607") : error deleting /var/lib/kubelet/pods/14bc3c31-f23e-4c67-a989-e85613bd5607/volume-subpaths: remove /var/lib/kubelet/pods/14bc3c31-f23e-4c67-a989-e85613bd5607/volume-subpaths: no such file or directory Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.493669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory" (OuterVolumeSpecName: "inventory") pod "14bc3c31-f23e-4c67-a989-e85613bd5607" (UID: "14bc3c31-f23e-4c67-a989-e85613bd5607"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.570504 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.570548 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbg8\" (UniqueName: \"kubernetes.io/projected/14bc3c31-f23e-4c67-a989-e85613bd5607-kube-api-access-wzbg8\") on node \"crc\" DevicePath \"\"" Nov 25 16:25:19 crc kubenswrapper[4743]: I1125 16:25:19.570559 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.002406 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" event={"ID":"14bc3c31-f23e-4c67-a989-e85613bd5607","Type":"ContainerDied","Data":"f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703"} Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.002446 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d822f5901b25ae2b3c0ea06c600d1516acfa82c8f8e0566f531243102c1703" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.002466 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.073694 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt"] Nov 25 16:25:20 crc kubenswrapper[4743]: E1125 16:25:20.074195 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="registry-server" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074215 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="registry-server" Nov 25 16:25:20 crc kubenswrapper[4743]: E1125 16:25:20.074231 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc3c31-f23e-4c67-a989-e85613bd5607" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc3c31-f23e-4c67-a989-e85613bd5607" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 16:25:20 crc kubenswrapper[4743]: E1125 16:25:20.074259 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="extract-content" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074267 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="extract-content" Nov 25 16:25:20 crc kubenswrapper[4743]: E1125 16:25:20.074285 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="extract-utilities" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074291 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="extract-utilities" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074497 
4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bc3c31-f23e-4c67-a989-e85613bd5607" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.074517 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea4677e-446d-48a1-8bf7-4affbd261da0" containerName="registry-server" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.075312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.078541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") pod \"14bc3c31-f23e-4c67-a989-e85613bd5607\" (UID: \"14bc3c31-f23e-4c67-a989-e85613bd5607\") " Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.081321 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14bc3c31-f23e-4c67-a989-e85613bd5607" (UID: "14bc3c31-f23e-4c67-a989-e85613bd5607"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.084320 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt"] Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.181345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8tc\" (UniqueName: \"kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.181647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.181774 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.182041 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14bc3c31-f23e-4c67-a989-e85613bd5607-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.283412 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.283763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.284055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8tc\" (UniqueName: \"kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.287065 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.288172 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.303323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8tc\" (UniqueName: \"kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.446243 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.775350 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:25:20 crc kubenswrapper[4743]: E1125 16:25:20.775997 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:25:20 crc kubenswrapper[4743]: I1125 16:25:20.984991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt"] Nov 25 16:25:21 crc kubenswrapper[4743]: I1125 16:25:21.012540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" event={"ID":"370e248d-8977-4a95-ac29-df64918b694b","Type":"ContainerStarted","Data":"6a7c75f77872b9b0da563bb16cd905336c68202847c682eff170553da15d27a3"} Nov 25 16:25:23 crc kubenswrapper[4743]: I1125 
16:25:23.029490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" event={"ID":"370e248d-8977-4a95-ac29-df64918b694b","Type":"ContainerStarted","Data":"0e8cbe87243dafc5ec6b1a4f10797a70e0846ad456e55b0577a441c646a599a5"} Nov 25 16:25:23 crc kubenswrapper[4743]: I1125 16:25:23.048773 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" podStartSLOduration=1.995193101 podStartE2EDuration="3.048754184s" podCreationTimestamp="2025-11-25 16:25:20 +0000 UTC" firstStartedPulling="2025-11-25 16:25:20.989876787 +0000 UTC m=+1600.111716326" lastFinishedPulling="2025-11-25 16:25:22.04343786 +0000 UTC m=+1601.165277409" observedRunningTime="2025-11-25 16:25:23.041772764 +0000 UTC m=+1602.163612313" watchObservedRunningTime="2025-11-25 16:25:23.048754184 +0000 UTC m=+1602.170593733" Nov 25 16:25:24 crc kubenswrapper[4743]: I1125 16:25:24.041567 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6edd-account-create-4mchv"] Nov 25 16:25:24 crc kubenswrapper[4743]: I1125 16:25:24.052685 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-scfvn"] Nov 25 16:25:24 crc kubenswrapper[4743]: I1125 16:25:24.064330 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6edd-account-create-4mchv"] Nov 25 16:25:24 crc kubenswrapper[4743]: I1125 16:25:24.074025 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-scfvn"] Nov 25 16:25:25 crc kubenswrapper[4743]: I1125 16:25:25.809179 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f77a2c7-d59b-43a4-9ffb-4edd475a57eb" path="/var/lib/kubelet/pods/9f77a2c7-d59b-43a4-9ffb-4edd475a57eb/volumes" Nov 25 16:25:25 crc kubenswrapper[4743]: I1125 16:25:25.810168 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bb815cd5-463d-4056-b4af-e9c6cf18a9c7" path="/var/lib/kubelet/pods/bb815cd5-463d-4056-b4af-e9c6cf18a9c7/volumes" Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.039733 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rjl8d"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.056622 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hkb7f"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.066763 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hkb7f"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.077771 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-27cb-account-create-d8dqk"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.087252 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rjl8d"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.094955 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f84a-account-create-9rs6f"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.103548 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-27cb-account-create-d8dqk"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.114264 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f84a-account-create-9rs6f"] Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.788293 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197ff5f2-2482-4735-8a0d-8b77ed613724" path="/var/lib/kubelet/pods/197ff5f2-2482-4735-8a0d-8b77ed613724/volumes" Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.788921 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d466b91c-227b-424c-a235-7ef50de97f94" path="/var/lib/kubelet/pods/d466b91c-227b-424c-a235-7ef50de97f94/volumes" Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.789447 4743 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4715ed9-7cbe-4519-84fd-db690a43a69f" path="/var/lib/kubelet/pods/d4715ed9-7cbe-4519-84fd-db690a43a69f/volumes" Nov 25 16:25:27 crc kubenswrapper[4743]: I1125 16:25:27.790016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad1f368-dada-4a6b-8060-9aed7b85a828" path="/var/lib/kubelet/pods/dad1f368-dada-4a6b-8060-9aed7b85a828/volumes" Nov 25 16:25:34 crc kubenswrapper[4743]: I1125 16:25:34.031650 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cfq9v"] Nov 25 16:25:34 crc kubenswrapper[4743]: I1125 16:25:34.055282 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cfq9v"] Nov 25 16:25:35 crc kubenswrapper[4743]: I1125 16:25:35.031665 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gcflw"] Nov 25 16:25:35 crc kubenswrapper[4743]: I1125 16:25:35.041166 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gcflw"] Nov 25 16:25:35 crc kubenswrapper[4743]: I1125 16:25:35.775825 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:25:35 crc kubenswrapper[4743]: E1125 16:25:35.776372 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:25:35 crc kubenswrapper[4743]: I1125 16:25:35.787355 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282877d5-d175-44e4-a9da-3253dd8c4d95" path="/var/lib/kubelet/pods/282877d5-d175-44e4-a9da-3253dd8c4d95/volumes" Nov 25 16:25:35 
crc kubenswrapper[4743]: I1125 16:25:35.788263 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548ce1d7-a489-4a29-9e63-56b28e48f7e1" path="/var/lib/kubelet/pods/548ce1d7-a489-4a29-9e63-56b28e48f7e1/volumes" Nov 25 16:25:50 crc kubenswrapper[4743]: I1125 16:25:50.775028 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:25:50 crc kubenswrapper[4743]: E1125 16:25:50.775816 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:26:05 crc kubenswrapper[4743]: I1125 16:26:05.775850 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:26:05 crc kubenswrapper[4743]: E1125 16:26:05.776731 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.088915 4743 scope.go:117] "RemoveContainer" containerID="033fef5562d777e22b51c09e80c321befea9c2bbdc6b911c6bfabbb7fd4ab60f" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.114831 4743 scope.go:117] "RemoveContainer" containerID="7df8f285b57c3b4402edc15443199ba89cb81950f6462b68f2876c587a38f0ff" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.156263 
4743 scope.go:117] "RemoveContainer" containerID="f1be27638a5ebffe15bac10257198b404a0f5825bad193f166d7267c34625ce3" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.231255 4743 scope.go:117] "RemoveContainer" containerID="6e81e903379dd1bb42887a7fcb498d24ffdc267ba0f44ec87102ff49c3e5b2d1" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.281231 4743 scope.go:117] "RemoveContainer" containerID="7d00eebe9af657c37ba2cb40280dee2d66b7d136106621bc29e6778f8f44a92c" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.322920 4743 scope.go:117] "RemoveContainer" containerID="c6e837874ff915e5c536412cb3da3cd182850c1c913b1c73f86a43bd12f82377" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.365689 4743 scope.go:117] "RemoveContainer" containerID="0d8927e45b70c2d03b6e2c4c1d1d644efa968af62e197c42c679207f8940c293" Nov 25 16:26:09 crc kubenswrapper[4743]: I1125 16:26:09.391954 4743 scope.go:117] "RemoveContainer" containerID="e946d2beaf2e652e9b437f9f865a3ebb63c21a0d4cc46ac8e5786ba2ea1c6994" Nov 25 16:26:15 crc kubenswrapper[4743]: I1125 16:26:15.042415 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wdbn9"] Nov 25 16:26:15 crc kubenswrapper[4743]: I1125 16:26:15.054047 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wdbn9"] Nov 25 16:26:15 crc kubenswrapper[4743]: I1125 16:26:15.789791 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24abed0a-5ed2-486b-ace3-d1b07ee69e5f" path="/var/lib/kubelet/pods/24abed0a-5ed2-486b-ace3-d1b07ee69e5f/volumes" Nov 25 16:26:16 crc kubenswrapper[4743]: I1125 16:26:16.038615 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qx98x"] Nov 25 16:26:16 crc kubenswrapper[4743]: I1125 16:26:16.047428 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qx98x"] Nov 25 16:26:17 crc kubenswrapper[4743]: I1125 16:26:17.788212 4743 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7bd5b52c-e448-4f66-ab08-4c24f541412d" path="/var/lib/kubelet/pods/7bd5b52c-e448-4f66-ab08-4c24f541412d/volumes" Nov 25 16:26:18 crc kubenswrapper[4743]: I1125 16:26:18.775044 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:26:18 crc kubenswrapper[4743]: E1125 16:26:18.775428 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:26:32 crc kubenswrapper[4743]: I1125 16:26:32.046787 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2lqmd"] Nov 25 16:26:32 crc kubenswrapper[4743]: I1125 16:26:32.058208 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2lqmd"] Nov 25 16:26:33 crc kubenswrapper[4743]: I1125 16:26:33.029083 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mqvnl"] Nov 25 16:26:33 crc kubenswrapper[4743]: I1125 16:26:33.040670 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mqvnl"] Nov 25 16:26:33 crc kubenswrapper[4743]: I1125 16:26:33.776449 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:26:33 crc kubenswrapper[4743]: E1125 16:26:33.776731 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:26:33 crc kubenswrapper[4743]: I1125 16:26:33.788690 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a037faa-f5b9-4abb-8132-2750befdf031" path="/var/lib/kubelet/pods/5a037faa-f5b9-4abb-8132-2750befdf031/volumes" Nov 25 16:26:33 crc kubenswrapper[4743]: I1125 16:26:33.790790 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e09359-debb-49f3-8490-c18e8ca5f63e" path="/var/lib/kubelet/pods/92e09359-debb-49f3-8490-c18e8ca5f63e/volumes" Nov 25 16:26:35 crc kubenswrapper[4743]: I1125 16:26:35.044041 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9vplz"] Nov 25 16:26:35 crc kubenswrapper[4743]: I1125 16:26:35.053040 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9vplz"] Nov 25 16:26:35 crc kubenswrapper[4743]: I1125 16:26:35.785828 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba" path="/var/lib/kubelet/pods/0478ee5a-d51d-4d5c-8bfb-4677ab9e60ba/volumes" Nov 25 16:26:48 crc kubenswrapper[4743]: I1125 16:26:48.776370 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:26:48 crc kubenswrapper[4743]: E1125 16:26:48.777351 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:26:55 crc kubenswrapper[4743]: 
I1125 16:26:55.670937 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-c64568bc5-svsgq" podUID="fd562da8-2d36-4517-8d73-237580575e98" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Nov 25 16:27:02 crc kubenswrapper[4743]: I1125 16:27:02.775069 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:27:02 crc kubenswrapper[4743]: E1125 16:27:02.775968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:27:02 crc kubenswrapper[4743]: I1125 16:27:02.987063 4743 generic.go:334] "Generic (PLEG): container finished" podID="370e248d-8977-4a95-ac29-df64918b694b" containerID="0e8cbe87243dafc5ec6b1a4f10797a70e0846ad456e55b0577a441c646a599a5" exitCode=0 Nov 25 16:27:02 crc kubenswrapper[4743]: I1125 16:27:02.987152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" event={"ID":"370e248d-8977-4a95-ac29-df64918b694b","Type":"ContainerDied","Data":"0e8cbe87243dafc5ec6b1a4f10797a70e0846ad456e55b0577a441c646a599a5"} Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.396306 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.471188 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key\") pod \"370e248d-8977-4a95-ac29-df64918b694b\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.471253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk8tc\" (UniqueName: \"kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc\") pod \"370e248d-8977-4a95-ac29-df64918b694b\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.471272 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory\") pod \"370e248d-8977-4a95-ac29-df64918b694b\" (UID: \"370e248d-8977-4a95-ac29-df64918b694b\") " Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.477950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc" (OuterVolumeSpecName: "kube-api-access-bk8tc") pod "370e248d-8977-4a95-ac29-df64918b694b" (UID: "370e248d-8977-4a95-ac29-df64918b694b"). InnerVolumeSpecName "kube-api-access-bk8tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.504894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory" (OuterVolumeSpecName: "inventory") pod "370e248d-8977-4a95-ac29-df64918b694b" (UID: "370e248d-8977-4a95-ac29-df64918b694b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.504995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "370e248d-8977-4a95-ac29-df64918b694b" (UID: "370e248d-8977-4a95-ac29-df64918b694b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.574760 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.574804 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk8tc\" (UniqueName: \"kubernetes.io/projected/370e248d-8977-4a95-ac29-df64918b694b-kube-api-access-bk8tc\") on node \"crc\" DevicePath \"\"" Nov 25 16:27:04 crc kubenswrapper[4743]: I1125 16:27:04.574818 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/370e248d-8977-4a95-ac29-df64918b694b-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.013137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" event={"ID":"370e248d-8977-4a95-ac29-df64918b694b","Type":"ContainerDied","Data":"6a7c75f77872b9b0da563bb16cd905336c68202847c682eff170553da15d27a3"} Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.013454 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7c75f77872b9b0da563bb16cd905336c68202847c682eff170553da15d27a3" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.013511 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.056789 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b457-account-create-pwttw"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.064314 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hbmzx"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.073079 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rz82t"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.080298 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0cdb-account-create-4lr9x"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.090718 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c3b4-account-create-vhtfp"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.099286 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hbmzx"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.108277 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rz82t"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.118023 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0cdb-account-create-4lr9x"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.126611 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b457-account-create-pwttw"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.134340 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c3b4-account-create-vhtfp"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.141925 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hn4tp"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.149513 4743 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hn4tp"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.157053 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62"] Nov 25 16:27:05 crc kubenswrapper[4743]: E1125 16:27:05.157824 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370e248d-8977-4a95-ac29-df64918b694b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.157844 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e248d-8977-4a95-ac29-df64918b694b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.158006 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="370e248d-8977-4a95-ac29-df64918b694b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.158634 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.160847 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.161221 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.161411 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.161579 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.165610 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62"] Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.286647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.286724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.287325 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjf6\" (UniqueName: \"kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.389425 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjf6\" (UniqueName: \"kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.389479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.389513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.394258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.394654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.408841 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjf6\" (UniqueName: \"kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pzm62\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.478776 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.784866 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a19749-2d18-460c-af7b-e3539fde228c" path="/var/lib/kubelet/pods/03a19749-2d18-460c-af7b-e3539fde228c/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.785820 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58dc0431-6633-41cc-9de5-9f18e85f82c1" path="/var/lib/kubelet/pods/58dc0431-6633-41cc-9de5-9f18e85f82c1/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.786383 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814cd753-a1ab-45d6-9eb8-239b998f43ac" path="/var/lib/kubelet/pods/814cd753-a1ab-45d6-9eb8-239b998f43ac/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.786904 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a30cfb3-c372-4a9a-a444-573a26493643" path="/var/lib/kubelet/pods/8a30cfb3-c372-4a9a-a444-573a26493643/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.787903 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c" path="/var/lib/kubelet/pods/8b4c1dc5-1b84-45ad-ba3f-d0b13a40706c/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.788403 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d16b76e-dc2a-456a-aaed-79f8338caaa9" path="/var/lib/kubelet/pods/9d16b76e-dc2a-456a-aaed-79f8338caaa9/volumes" Nov 25 16:27:05 crc kubenswrapper[4743]: I1125 16:27:05.978382 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62"] Nov 25 16:27:05 crc kubenswrapper[4743]: W1125 16:27:05.988728 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf95749b_9f3b_4df2_afaf_869ec45e1807.slice/crio-e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1 WatchSource:0}: Error finding container e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1: Status 404 returned error can't find the container with id e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1 Nov 25 16:27:06 crc kubenswrapper[4743]: I1125 16:27:06.023503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" event={"ID":"cf95749b-9f3b-4df2-afaf-869ec45e1807","Type":"ContainerStarted","Data":"e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1"} Nov 25 16:27:07 crc kubenswrapper[4743]: I1125 16:27:07.035888 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" event={"ID":"cf95749b-9f3b-4df2-afaf-869ec45e1807","Type":"ContainerStarted","Data":"2d0f3a16e928ef4f6526d456355703a223300d251c2bb216751aced4a54c58e7"} Nov 25 16:27:07 crc kubenswrapper[4743]: I1125 16:27:07.072231 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" podStartSLOduration=1.647055329 podStartE2EDuration="2.072208641s" podCreationTimestamp="2025-11-25 16:27:05 +0000 UTC" firstStartedPulling="2025-11-25 16:27:05.994535135 +0000 UTC m=+1705.116374684" lastFinishedPulling="2025-11-25 16:27:06.419688437 +0000 UTC m=+1705.541527996" observedRunningTime="2025-11-25 16:27:07.065484979 +0000 UTC m=+1706.187324528" watchObservedRunningTime="2025-11-25 16:27:07.072208641 +0000 UTC m=+1706.194048190" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.533078 4743 scope.go:117] "RemoveContainer" containerID="9ee9e9de02a98ab5efcaf524bd46dfeaa0769284a00cafa5cda9abedfc1a3381" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 
16:27:09.567312 4743 scope.go:117] "RemoveContainer" containerID="f4839caa89b2d5d190088fa2bcc56bd460ad24455e645b4aa8df374ccc425705" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.602967 4743 scope.go:117] "RemoveContainer" containerID="4f9401c46bd94810787e0917f1f25545dd02e23c768a8676777f641e4ec858f2" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.654734 4743 scope.go:117] "RemoveContainer" containerID="c95a6e8287be81a1fcbfcea5065b5414aca80bba0026c5f9b4fb904c2818834d" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.700234 4743 scope.go:117] "RemoveContainer" containerID="6304cc9db27c76879661d45840a3877910738087c85af9ed315c1a8a4bf37f0c" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.762569 4743 scope.go:117] "RemoveContainer" containerID="ebf27c6a18f7a9320091c1120b1d5b3676bc26f10881742dfddd3738b3f818a4" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.799505 4743 scope.go:117] "RemoveContainer" containerID="69e370f1259b514c151bb504c5ed40e2e0b3dea122ce69771797f09b34dbcde3" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.818665 4743 scope.go:117] "RemoveContainer" containerID="52a815fadde519998c3a62340342da1c85661bb301f4810850a3925e08989d77" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.839036 4743 scope.go:117] "RemoveContainer" containerID="eb73eed19eaf8f467c1547d00948479da0a5657dc4d8bfea99addae5c9fcd4e0" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.874999 4743 scope.go:117] "RemoveContainer" containerID="591f6779b50fcb1835cc05a42164ab8f18ebc40ba20de925362260c2ab31f8ce" Nov 25 16:27:09 crc kubenswrapper[4743]: I1125 16:27:09.918003 4743 scope.go:117] "RemoveContainer" containerID="22161b8d3ffa7ed61b572c2d37f445d76c6c813c368511f7e966c7dd4571704d" Nov 25 16:27:16 crc kubenswrapper[4743]: I1125 16:27:16.774652 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:27:16 crc kubenswrapper[4743]: E1125 16:27:16.775473 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:27:28 crc kubenswrapper[4743]: I1125 16:27:28.042063 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9r9vw"] Nov 25 16:27:28 crc kubenswrapper[4743]: I1125 16:27:28.049527 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9r9vw"] Nov 25 16:27:29 crc kubenswrapper[4743]: I1125 16:27:29.775168 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:27:29 crc kubenswrapper[4743]: E1125 16:27:29.775814 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:27:29 crc kubenswrapper[4743]: I1125 16:27:29.787534 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d49d73-f8be-44ee-a3fc-37612fdb9440" path="/var/lib/kubelet/pods/84d49d73-f8be-44ee-a3fc-37612fdb9440/volumes" Nov 25 16:27:44 crc kubenswrapper[4743]: I1125 16:27:44.774779 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:27:44 crc kubenswrapper[4743]: E1125 16:27:44.775739 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:27:50 crc kubenswrapper[4743]: I1125 16:27:50.042552 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pccxd"] Nov 25 16:27:50 crc kubenswrapper[4743]: I1125 16:27:50.052459 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pccxd"] Nov 25 16:27:51 crc kubenswrapper[4743]: I1125 16:27:51.787517 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451a339d-6ea1-4ce0-a550-fcaad7d83f28" path="/var/lib/kubelet/pods/451a339d-6ea1-4ce0-a550-fcaad7d83f28/volumes" Nov 25 16:27:56 crc kubenswrapper[4743]: I1125 16:27:56.774424 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:27:56 crc kubenswrapper[4743]: E1125 16:27:56.775288 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:28:01 crc kubenswrapper[4743]: I1125 16:28:01.040314 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xssnc"] Nov 25 16:28:01 crc kubenswrapper[4743]: I1125 16:28:01.059778 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xssnc"] Nov 25 16:28:01 crc kubenswrapper[4743]: I1125 
16:28:01.787482 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796f4930-dad8-4c02-9ffa-00df9a6689ff" path="/var/lib/kubelet/pods/796f4930-dad8-4c02-9ffa-00df9a6689ff/volumes" Nov 25 16:28:07 crc kubenswrapper[4743]: I1125 16:28:07.775701 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:28:07 crc kubenswrapper[4743]: E1125 16:28:07.777002 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:28:10 crc kubenswrapper[4743]: I1125 16:28:10.122798 4743 scope.go:117] "RemoveContainer" containerID="87336b290dd47e553fbd823716fa8fbfcb4d48cad500d15d9e16be29952f724e" Nov 25 16:28:10 crc kubenswrapper[4743]: I1125 16:28:10.160424 4743 scope.go:117] "RemoveContainer" containerID="e88c6afbcb1a7d264540f7bf583e05d03e92134479a4a53c4fc27362be2d39b0" Nov 25 16:28:10 crc kubenswrapper[4743]: I1125 16:28:10.204124 4743 scope.go:117] "RemoveContainer" containerID="9ca9adb39873a76fd53f3d453014253693331a45377595054dd03d9f710977f8" Nov 25 16:28:17 crc kubenswrapper[4743]: I1125 16:28:17.669405 4743 generic.go:334] "Generic (PLEG): container finished" podID="cf95749b-9f3b-4df2-afaf-869ec45e1807" containerID="2d0f3a16e928ef4f6526d456355703a223300d251c2bb216751aced4a54c58e7" exitCode=0 Nov 25 16:28:17 crc kubenswrapper[4743]: I1125 16:28:17.669511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" 
event={"ID":"cf95749b-9f3b-4df2-afaf-869ec45e1807","Type":"ContainerDied","Data":"2d0f3a16e928ef4f6526d456355703a223300d251c2bb216751aced4a54c58e7"} Nov 25 16:28:18 crc kubenswrapper[4743]: I1125 16:28:18.775984 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:28:18 crc kubenswrapper[4743]: E1125 16:28:18.776290 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.048784 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.196957 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory\") pod \"cf95749b-9f3b-4df2-afaf-869ec45e1807\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.197076 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key\") pod \"cf95749b-9f3b-4df2-afaf-869ec45e1807\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.197243 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhjf6\" (UniqueName: \"kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6\") pod 
\"cf95749b-9f3b-4df2-afaf-869ec45e1807\" (UID: \"cf95749b-9f3b-4df2-afaf-869ec45e1807\") " Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.202993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6" (OuterVolumeSpecName: "kube-api-access-mhjf6") pod "cf95749b-9f3b-4df2-afaf-869ec45e1807" (UID: "cf95749b-9f3b-4df2-afaf-869ec45e1807"). InnerVolumeSpecName "kube-api-access-mhjf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.227638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf95749b-9f3b-4df2-afaf-869ec45e1807" (UID: "cf95749b-9f3b-4df2-afaf-869ec45e1807"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.229850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory" (OuterVolumeSpecName: "inventory") pod "cf95749b-9f3b-4df2-afaf-869ec45e1807" (UID: "cf95749b-9f3b-4df2-afaf-869ec45e1807"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.300741 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhjf6\" (UniqueName: \"kubernetes.io/projected/cf95749b-9f3b-4df2-afaf-869ec45e1807-kube-api-access-mhjf6\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.300794 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.300807 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf95749b-9f3b-4df2-afaf-869ec45e1807-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.689991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" event={"ID":"cf95749b-9f3b-4df2-afaf-869ec45e1807","Type":"ContainerDied","Data":"e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1"} Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.690964 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e10ed421425c620e71b31b2e2127f463e71b4977c396edd74919452b8a887db1" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.690069 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pzm62" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.788386 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr"] Nov 25 16:28:19 crc kubenswrapper[4743]: E1125 16:28:19.788756 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf95749b-9f3b-4df2-afaf-869ec45e1807" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.788773 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf95749b-9f3b-4df2-afaf-869ec45e1807" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.789019 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf95749b-9f3b-4df2-afaf-869ec45e1807" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.789787 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr"] Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.789872 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.792162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.794688 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.794724 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.794969 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.911324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.911411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8m4r\" (UniqueName: \"kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:19 crc kubenswrapper[4743]: I1125 16:28:19.911437 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.013655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.013744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8m4r\" (UniqueName: \"kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.013777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.017761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.018448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.032029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8m4r\" (UniqueName: \"kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.115358 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.608309 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr"] Nov 25 16:28:20 crc kubenswrapper[4743]: W1125 16:28:20.613736 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod865996cb_146d_428e_aff6_7ce31c808ffe.slice/crio-7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f WatchSource:0}: Error finding container 7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f: Status 404 returned error can't find the container with id 7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f Nov 25 16:28:20 crc kubenswrapper[4743]: I1125 16:28:20.699302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" event={"ID":"865996cb-146d-428e-aff6-7ce31c808ffe","Type":"ContainerStarted","Data":"7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f"} Nov 25 16:28:22 crc kubenswrapper[4743]: I1125 16:28:22.715971 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" event={"ID":"865996cb-146d-428e-aff6-7ce31c808ffe","Type":"ContainerStarted","Data":"ffc46db5b07d785d0c1f52f40d7b71c22977a0009d8a6bfdf75be32b82705dbb"} Nov 25 16:28:22 crc kubenswrapper[4743]: I1125 16:28:22.736091 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" podStartSLOduration=2.490501793 podStartE2EDuration="3.736075267s" podCreationTimestamp="2025-11-25 16:28:19 +0000 UTC" firstStartedPulling="2025-11-25 16:28:20.615686896 +0000 UTC m=+1779.737526445" lastFinishedPulling="2025-11-25 16:28:21.86126036 +0000 UTC 
m=+1780.983099919" observedRunningTime="2025-11-25 16:28:22.731316857 +0000 UTC m=+1781.853156406" watchObservedRunningTime="2025-11-25 16:28:22.736075267 +0000 UTC m=+1781.857914806" Nov 25 16:28:27 crc kubenswrapper[4743]: I1125 16:28:27.755136 4743 generic.go:334] "Generic (PLEG): container finished" podID="865996cb-146d-428e-aff6-7ce31c808ffe" containerID="ffc46db5b07d785d0c1f52f40d7b71c22977a0009d8a6bfdf75be32b82705dbb" exitCode=0 Nov 25 16:28:27 crc kubenswrapper[4743]: I1125 16:28:27.755222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" event={"ID":"865996cb-146d-428e-aff6-7ce31c808ffe","Type":"ContainerDied","Data":"ffc46db5b07d785d0c1f52f40d7b71c22977a0009d8a6bfdf75be32b82705dbb"} Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.165840 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.280856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8m4r\" (UniqueName: \"kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r\") pod \"865996cb-146d-428e-aff6-7ce31c808ffe\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.280922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory\") pod \"865996cb-146d-428e-aff6-7ce31c808ffe\" (UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.281042 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key\") pod \"865996cb-146d-428e-aff6-7ce31c808ffe\" 
(UID: \"865996cb-146d-428e-aff6-7ce31c808ffe\") " Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.287225 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r" (OuterVolumeSpecName: "kube-api-access-g8m4r") pod "865996cb-146d-428e-aff6-7ce31c808ffe" (UID: "865996cb-146d-428e-aff6-7ce31c808ffe"). InnerVolumeSpecName "kube-api-access-g8m4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.308908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "865996cb-146d-428e-aff6-7ce31c808ffe" (UID: "865996cb-146d-428e-aff6-7ce31c808ffe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.313731 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory" (OuterVolumeSpecName: "inventory") pod "865996cb-146d-428e-aff6-7ce31c808ffe" (UID: "865996cb-146d-428e-aff6-7ce31c808ffe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.383185 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8m4r\" (UniqueName: \"kubernetes.io/projected/865996cb-146d-428e-aff6-7ce31c808ffe-kube-api-access-g8m4r\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.383227 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.383236 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865996cb-146d-428e-aff6-7ce31c808ffe-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.774565 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:28:29 crc kubenswrapper[4743]: E1125 16:28:29.774964 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.776306 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.785811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr" event={"ID":"865996cb-146d-428e-aff6-7ce31c808ffe","Type":"ContainerDied","Data":"7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f"} Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.785848 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb27a46ffb9e40f058c5a7e8251dc329050b370ea3b2bd83f5017d07baaf46f" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.852061 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v"] Nov 25 16:28:29 crc kubenswrapper[4743]: E1125 16:28:29.852752 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865996cb-146d-428e-aff6-7ce31c808ffe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.852870 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="865996cb-146d-428e-aff6-7ce31c808ffe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.853219 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="865996cb-146d-428e-aff6-7ce31c808ffe" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.854013 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.863144 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.863676 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.863920 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.864320 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v"] Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.865193 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.993125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.993203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:29 crc kubenswrapper[4743]: I1125 16:28:29.993269 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c7c\" (UniqueName: \"kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.095511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.095585 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.095647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c7c\" (UniqueName: \"kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.099808 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: 
\"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.099926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.111647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c7c\" (UniqueName: \"kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtx8v\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.173407 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.655554 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v"] Nov 25 16:28:30 crc kubenswrapper[4743]: I1125 16:28:30.787857 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" event={"ID":"b1c2dd10-3126-4c40-a55f-679ed3441056","Type":"ContainerStarted","Data":"dbaae98958c64645c932b62c0fa8003b00d5c7a865275e0263d11b58f07f9857"} Nov 25 16:28:31 crc kubenswrapper[4743]: I1125 16:28:31.798081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" event={"ID":"b1c2dd10-3126-4c40-a55f-679ed3441056","Type":"ContainerStarted","Data":"0dda1b0432586ea463507768ae91a8d0a1a74a71ab918feeabc79bd2d6da1e84"} Nov 25 16:28:31 crc kubenswrapper[4743]: I1125 16:28:31.825261 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" podStartSLOduration=2.175032984 podStartE2EDuration="2.825241226s" podCreationTimestamp="2025-11-25 16:28:29 +0000 UTC" firstStartedPulling="2025-11-25 16:28:30.658455422 +0000 UTC m=+1789.780294971" lastFinishedPulling="2025-11-25 16:28:31.308663664 +0000 UTC m=+1790.430503213" observedRunningTime="2025-11-25 16:28:31.817029437 +0000 UTC m=+1790.938869006" watchObservedRunningTime="2025-11-25 16:28:31.825241226 +0000 UTC m=+1790.947080775" Nov 25 16:28:37 crc kubenswrapper[4743]: I1125 16:28:37.046885 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q2cg7"] Nov 25 16:28:37 crc kubenswrapper[4743]: I1125 16:28:37.057199 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q2cg7"] Nov 25 16:28:37 crc kubenswrapper[4743]: I1125 
16:28:37.790962 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd28a285-f7b2-4b46-99d5-bf60741558f6" path="/var/lib/kubelet/pods/cd28a285-f7b2-4b46-99d5-bf60741558f6/volumes" Nov 25 16:28:40 crc kubenswrapper[4743]: I1125 16:28:40.774669 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:28:40 crc kubenswrapper[4743]: E1125 16:28:40.775818 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:28:53 crc kubenswrapper[4743]: I1125 16:28:53.775148 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:28:53 crc kubenswrapper[4743]: E1125 16:28:53.775848 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:29:06 crc kubenswrapper[4743]: I1125 16:29:06.774686 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:29:06 crc kubenswrapper[4743]: E1125 16:29:06.775393 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:29:10 crc kubenswrapper[4743]: I1125 16:29:10.121193 4743 generic.go:334] "Generic (PLEG): container finished" podID="b1c2dd10-3126-4c40-a55f-679ed3441056" containerID="0dda1b0432586ea463507768ae91a8d0a1a74a71ab918feeabc79bd2d6da1e84" exitCode=0 Nov 25 16:29:10 crc kubenswrapper[4743]: I1125 16:29:10.121252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" event={"ID":"b1c2dd10-3126-4c40-a55f-679ed3441056","Type":"ContainerDied","Data":"0dda1b0432586ea463507768ae91a8d0a1a74a71ab918feeabc79bd2d6da1e84"} Nov 25 16:29:10 crc kubenswrapper[4743]: I1125 16:29:10.312022 4743 scope.go:117] "RemoveContainer" containerID="2faade60b8f184654a01cfffc8fe45de0a51d1a1d30e824fd896206cd5538749" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.519752 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.592325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46c7c\" (UniqueName: \"kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c\") pod \"b1c2dd10-3126-4c40-a55f-679ed3441056\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.592652 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory\") pod \"b1c2dd10-3126-4c40-a55f-679ed3441056\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.592720 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key\") pod \"b1c2dd10-3126-4c40-a55f-679ed3441056\" (UID: \"b1c2dd10-3126-4c40-a55f-679ed3441056\") " Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.600784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c" (OuterVolumeSpecName: "kube-api-access-46c7c") pod "b1c2dd10-3126-4c40-a55f-679ed3441056" (UID: "b1c2dd10-3126-4c40-a55f-679ed3441056"). InnerVolumeSpecName "kube-api-access-46c7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.624957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b1c2dd10-3126-4c40-a55f-679ed3441056" (UID: "b1c2dd10-3126-4c40-a55f-679ed3441056"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.630815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory" (OuterVolumeSpecName: "inventory") pod "b1c2dd10-3126-4c40-a55f-679ed3441056" (UID: "b1c2dd10-3126-4c40-a55f-679ed3441056"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.694749 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.695080 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46c7c\" (UniqueName: \"kubernetes.io/projected/b1c2dd10-3126-4c40-a55f-679ed3441056-kube-api-access-46c7c\") on node \"crc\" DevicePath \"\"" Nov 25 16:29:11 crc kubenswrapper[4743]: I1125 16:29:11.695094 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c2dd10-3126-4c40-a55f-679ed3441056-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.138051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" event={"ID":"b1c2dd10-3126-4c40-a55f-679ed3441056","Type":"ContainerDied","Data":"dbaae98958c64645c932b62c0fa8003b00d5c7a865275e0263d11b58f07f9857"} Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.138092 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbaae98958c64645c932b62c0fa8003b00d5c7a865275e0263d11b58f07f9857" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.138367 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtx8v" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.216555 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt"] Nov 25 16:29:12 crc kubenswrapper[4743]: E1125 16:29:12.217083 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c2dd10-3126-4c40-a55f-679ed3441056" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.217108 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c2dd10-3126-4c40-a55f-679ed3441056" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.217372 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c2dd10-3126-4c40-a55f-679ed3441056" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.218187 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.220701 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.220915 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.221376 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.221547 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.227280 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt"] Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.304856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svhmt\" (UniqueName: \"kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.304922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.305060 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.406752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.406849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svhmt\" (UniqueName: \"kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.406884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.411133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: 
\"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.412106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.424814 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svhmt\" (UniqueName: \"kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:12 crc kubenswrapper[4743]: I1125 16:29:12.543167 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:29:13 crc kubenswrapper[4743]: I1125 16:29:13.055198 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt"] Nov 25 16:29:13 crc kubenswrapper[4743]: I1125 16:29:13.147215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" event={"ID":"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0","Type":"ContainerStarted","Data":"ccc5283daa36b789764150f9a7764a4e2def83362e6989ad726b7666b018bfb7"} Nov 25 16:29:14 crc kubenswrapper[4743]: I1125 16:29:14.156355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" event={"ID":"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0","Type":"ContainerStarted","Data":"ec99ad32ecc8a5f26f5dc8634a6058cfd29b2c851fc8adf19e09f3dc79fda435"} Nov 25 16:29:14 crc kubenswrapper[4743]: I1125 16:29:14.178105 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" podStartSLOduration=1.7486391060000002 podStartE2EDuration="2.178083914s" podCreationTimestamp="2025-11-25 16:29:12 +0000 UTC" firstStartedPulling="2025-11-25 16:29:13.060016925 +0000 UTC m=+1832.181856474" lastFinishedPulling="2025-11-25 16:29:13.489461733 +0000 UTC m=+1832.611301282" observedRunningTime="2025-11-25 16:29:14.172543699 +0000 UTC m=+1833.294383248" watchObservedRunningTime="2025-11-25 16:29:14.178083914 +0000 UTC m=+1833.299923453" Nov 25 16:29:20 crc kubenswrapper[4743]: I1125 16:29:20.774701 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:29:20 crc kubenswrapper[4743]: E1125 16:29:20.775446 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:29:32 crc kubenswrapper[4743]: I1125 16:29:32.774769 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:29:32 crc kubenswrapper[4743]: E1125 16:29:32.775914 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:29:43 crc kubenswrapper[4743]: I1125 16:29:43.775401 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:29:43 crc kubenswrapper[4743]: E1125 16:29:43.776274 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:29:57 crc kubenswrapper[4743]: I1125 16:29:57.775585 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:29:58 crc kubenswrapper[4743]: I1125 16:29:58.518644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac"} Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.144720 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc"] Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.146505 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.149909 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.150057 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.153373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc"] Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.260305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.260828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: 
\"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.261338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zhm\" (UniqueName: \"kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.363109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zhm\" (UniqueName: \"kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.363242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.363285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.364418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.369813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.380102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zhm\" (UniqueName: \"kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm\") pod \"collect-profiles-29401470-hrrxc\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.471052 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:00 crc kubenswrapper[4743]: I1125 16:30:00.888929 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc"] Nov 25 16:30:00 crc kubenswrapper[4743]: W1125 16:30:00.896260 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee3bd932_56f5_449a_a11e_0d41ffb55aa5.slice/crio-8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f WatchSource:0}: Error finding container 8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f: Status 404 returned error can't find the container with id 8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f Nov 25 16:30:01 crc kubenswrapper[4743]: I1125 16:30:01.545575 4743 generic.go:334] "Generic (PLEG): container finished" podID="ee3bd932-56f5-449a-a11e-0d41ffb55aa5" containerID="75ad86ee2deb12c8d4ef0984c407011996bdeccd2013bb9c679f16491d08b565" exitCode=0 Nov 25 16:30:01 crc kubenswrapper[4743]: I1125 16:30:01.545668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" event={"ID":"ee3bd932-56f5-449a-a11e-0d41ffb55aa5","Type":"ContainerDied","Data":"75ad86ee2deb12c8d4ef0984c407011996bdeccd2013bb9c679f16491d08b565"} Nov 25 16:30:01 crc kubenswrapper[4743]: I1125 16:30:01.545916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" event={"ID":"ee3bd932-56f5-449a-a11e-0d41ffb55aa5","Type":"ContainerStarted","Data":"8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f"} Nov 25 16:30:02 crc kubenswrapper[4743]: I1125 16:30:02.555253 4743 generic.go:334] "Generic (PLEG): container finished" podID="78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" 
containerID="ec99ad32ecc8a5f26f5dc8634a6058cfd29b2c851fc8adf19e09f3dc79fda435" exitCode=0 Nov 25 16:30:02 crc kubenswrapper[4743]: I1125 16:30:02.555334 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" event={"ID":"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0","Type":"ContainerDied","Data":"ec99ad32ecc8a5f26f5dc8634a6058cfd29b2c851fc8adf19e09f3dc79fda435"} Nov 25 16:30:02 crc kubenswrapper[4743]: I1125 16:30:02.909760 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.014618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume\") pod \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.014994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zhm\" (UniqueName: \"kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm\") pod \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.015161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume\") pod \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\" (UID: \"ee3bd932-56f5-449a-a11e-0d41ffb55aa5\") " Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.015846 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"ee3bd932-56f5-449a-a11e-0d41ffb55aa5" (UID: "ee3bd932-56f5-449a-a11e-0d41ffb55aa5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.020505 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm" (OuterVolumeSpecName: "kube-api-access-z9zhm") pod "ee3bd932-56f5-449a-a11e-0d41ffb55aa5" (UID: "ee3bd932-56f5-449a-a11e-0d41ffb55aa5"). InnerVolumeSpecName "kube-api-access-z9zhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.020675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ee3bd932-56f5-449a-a11e-0d41ffb55aa5" (UID: "ee3bd932-56f5-449a-a11e-0d41ffb55aa5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.117327 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.117365 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zhm\" (UniqueName: \"kubernetes.io/projected/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-kube-api-access-z9zhm\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.117379 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee3bd932-56f5-449a-a11e-0d41ffb55aa5-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.565107 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.565119 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401470-hrrxc" event={"ID":"ee3bd932-56f5-449a-a11e-0d41ffb55aa5","Type":"ContainerDied","Data":"8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f"} Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.565169 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8250304fc2ff526fb9a6c3dc3fc38a8757f5c4796878d01e2eb62c92e82f171f" Nov 25 16:30:03 crc kubenswrapper[4743]: I1125 16:30:03.941323 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.033123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svhmt\" (UniqueName: \"kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt\") pod \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.033278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory\") pod \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.033425 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key\") pod \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\" (UID: \"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0\") " Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.037758 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt" (OuterVolumeSpecName: "kube-api-access-svhmt") pod "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" (UID: "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0"). InnerVolumeSpecName "kube-api-access-svhmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.062371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" (UID: "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.063186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory" (OuterVolumeSpecName: "inventory") pod "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" (UID: "78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.136504 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svhmt\" (UniqueName: \"kubernetes.io/projected/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-kube-api-access-svhmt\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.136788 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.136986 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.580528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" event={"ID":"78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0","Type":"ContainerDied","Data":"ccc5283daa36b789764150f9a7764a4e2def83362e6989ad726b7666b018bfb7"} Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.580570 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.580568 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc5283daa36b789764150f9a7764a4e2def83362e6989ad726b7666b018bfb7" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.668951 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vd9jg"] Nov 25 16:30:04 crc kubenswrapper[4743]: E1125 16:30:04.670545 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3bd932-56f5-449a-a11e-0d41ffb55aa5" containerName="collect-profiles" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.670571 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3bd932-56f5-449a-a11e-0d41ffb55aa5" containerName="collect-profiles" Nov 25 16:30:04 crc kubenswrapper[4743]: E1125 16:30:04.670620 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.670631 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.670873 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.670894 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3bd932-56f5-449a-a11e-0d41ffb55aa5" containerName="collect-profiles" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.671693 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.676074 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.676462 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.677789 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.680665 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.686184 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vd9jg"] Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.849283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.849499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.849538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9ndgg\" (UniqueName: \"kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.951727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.951791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ndgg\" (UniqueName: \"kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.951863 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.957255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.957301 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:04 crc kubenswrapper[4743]: I1125 16:30:04.969741 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ndgg\" (UniqueName: \"kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg\") pod \"ssh-known-hosts-edpm-deployment-vd9jg\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:05 crc kubenswrapper[4743]: I1125 16:30:05.023186 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:05 crc kubenswrapper[4743]: I1125 16:30:05.518832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vd9jg"] Nov 25 16:30:05 crc kubenswrapper[4743]: W1125 16:30:05.522303 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3738fd51_cc5f_4837_8a29_3dc3d3bbcfd5.slice/crio-84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a WatchSource:0}: Error finding container 84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a: Status 404 returned error can't find the container with id 84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a Nov 25 16:30:05 crc kubenswrapper[4743]: I1125 16:30:05.525899 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:30:05 crc kubenswrapper[4743]: I1125 16:30:05.599877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" event={"ID":"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5","Type":"ContainerStarted","Data":"84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a"} Nov 25 16:30:06 crc kubenswrapper[4743]: I1125 16:30:06.609371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" event={"ID":"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5","Type":"ContainerStarted","Data":"4ee9813c50f076bd26119c1a71bce47c5b60e2587518f4ed7c000b9cbff8425d"} Nov 25 16:30:06 crc kubenswrapper[4743]: I1125 16:30:06.635031 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" podStartSLOduration=2.156254946 podStartE2EDuration="2.635011218s" podCreationTimestamp="2025-11-25 16:30:04 +0000 UTC" firstStartedPulling="2025-11-25 16:30:05.525623664 +0000 UTC m=+1884.647463233" lastFinishedPulling="2025-11-25 16:30:06.004379956 +0000 UTC m=+1885.126219505" observedRunningTime="2025-11-25 16:30:06.624904088 +0000 UTC m=+1885.746743657" watchObservedRunningTime="2025-11-25 16:30:06.635011218 +0000 UTC m=+1885.756850767" Nov 25 16:30:12 crc kubenswrapper[4743]: I1125 16:30:12.663368 4743 generic.go:334] "Generic (PLEG): container finished" podID="3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" containerID="4ee9813c50f076bd26119c1a71bce47c5b60e2587518f4ed7c000b9cbff8425d" exitCode=0 Nov 25 16:30:12 crc kubenswrapper[4743]: I1125 16:30:12.663428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" event={"ID":"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5","Type":"ContainerDied","Data":"4ee9813c50f076bd26119c1a71bce47c5b60e2587518f4ed7c000b9cbff8425d"} Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.081458 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.242201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ndgg\" (UniqueName: \"kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg\") pod \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.242341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam\") pod \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.242421 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0\") pod \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\" (UID: \"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5\") " Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.247208 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg" (OuterVolumeSpecName: "kube-api-access-9ndgg") pod "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" (UID: "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5"). InnerVolumeSpecName "kube-api-access-9ndgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.270254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" (UID: "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.290931 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" (UID: "3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.345011 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.345044 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.345065 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ndgg\" (UniqueName: \"kubernetes.io/projected/3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5-kube-api-access-9ndgg\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.680331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" 
event={"ID":"3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5","Type":"ContainerDied","Data":"84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a"} Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.680689 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ecf66892ec6346bde5d1cefd5fd18fa39f1a1d959261b2f79e1e77c0e5a68a" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.680394 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vd9jg" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.746981 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46"] Nov 25 16:30:14 crc kubenswrapper[4743]: E1125 16:30:14.747435 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" containerName="ssh-known-hosts-edpm-deployment" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.747457 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" containerName="ssh-known-hosts-edpm-deployment" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.747809 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5" containerName="ssh-known-hosts-edpm-deployment" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.748492 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.751538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.751570 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.751829 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.751971 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.760691 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46"] Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.854222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.854307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn677\" (UniqueName: \"kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.854358 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.955653 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.955913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.955982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn677\" (UniqueName: \"kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.959491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.959831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:14 crc kubenswrapper[4743]: I1125 16:30:14.972387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn677\" (UniqueName: \"kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gqf46\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:15 crc kubenswrapper[4743]: I1125 16:30:15.065764 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:15 crc kubenswrapper[4743]: I1125 16:30:15.582982 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46"] Nov 25 16:30:15 crc kubenswrapper[4743]: W1125 16:30:15.588738 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2aeec84_22a9_4f07_a1e2_12e61f62f09c.slice/crio-e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232 WatchSource:0}: Error finding container e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232: Status 404 returned error can't find the container with id e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232 Nov 25 16:30:15 crc kubenswrapper[4743]: I1125 16:30:15.690039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" event={"ID":"a2aeec84-22a9-4f07-a1e2-12e61f62f09c","Type":"ContainerStarted","Data":"e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232"} Nov 25 16:30:16 crc kubenswrapper[4743]: I1125 16:30:16.719690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" event={"ID":"a2aeec84-22a9-4f07-a1e2-12e61f62f09c","Type":"ContainerStarted","Data":"fa334be717ae0da14a4c49bff4cf156e349dcf548899ca0012335af9cfd8b1ff"} Nov 25 16:30:16 crc kubenswrapper[4743]: I1125 16:30:16.747212 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" podStartSLOduration=2.335772648 podStartE2EDuration="2.747194745s" podCreationTimestamp="2025-11-25 16:30:14 +0000 UTC" firstStartedPulling="2025-11-25 16:30:15.591863592 +0000 UTC m=+1894.713703141" lastFinishedPulling="2025-11-25 16:30:16.003285689 +0000 UTC m=+1895.125125238" observedRunningTime="2025-11-25 
16:30:16.737783858 +0000 UTC m=+1895.859623407" watchObservedRunningTime="2025-11-25 16:30:16.747194745 +0000 UTC m=+1895.869034294" Nov 25 16:30:24 crc kubenswrapper[4743]: I1125 16:30:24.789413 4743 generic.go:334] "Generic (PLEG): container finished" podID="a2aeec84-22a9-4f07-a1e2-12e61f62f09c" containerID="fa334be717ae0da14a4c49bff4cf156e349dcf548899ca0012335af9cfd8b1ff" exitCode=0 Nov 25 16:30:24 crc kubenswrapper[4743]: I1125 16:30:24.789497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" event={"ID":"a2aeec84-22a9-4f07-a1e2-12e61f62f09c","Type":"ContainerDied","Data":"fa334be717ae0da14a4c49bff4cf156e349dcf548899ca0012335af9cfd8b1ff"} Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.213713 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.357908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory\") pod \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.358037 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key\") pod \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.358192 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn677\" (UniqueName: \"kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677\") pod \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\" (UID: \"a2aeec84-22a9-4f07-a1e2-12e61f62f09c\") " Nov 25 16:30:26 crc 
kubenswrapper[4743]: I1125 16:30:26.363769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677" (OuterVolumeSpecName: "kube-api-access-fn677") pod "a2aeec84-22a9-4f07-a1e2-12e61f62f09c" (UID: "a2aeec84-22a9-4f07-a1e2-12e61f62f09c"). InnerVolumeSpecName "kube-api-access-fn677". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.384791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory" (OuterVolumeSpecName: "inventory") pod "a2aeec84-22a9-4f07-a1e2-12e61f62f09c" (UID: "a2aeec84-22a9-4f07-a1e2-12e61f62f09c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.385214 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a2aeec84-22a9-4f07-a1e2-12e61f62f09c" (UID: "a2aeec84-22a9-4f07-a1e2-12e61f62f09c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.461215 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.461261 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.461270 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn677\" (UniqueName: \"kubernetes.io/projected/a2aeec84-22a9-4f07-a1e2-12e61f62f09c-kube-api-access-fn677\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.810833 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" event={"ID":"a2aeec84-22a9-4f07-a1e2-12e61f62f09c","Type":"ContainerDied","Data":"e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232"} Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.811174 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9533248cd256ac0d6c55df8e1fbc91ab9f5542433c9c94fbe50f81eae4e9232" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.810881 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gqf46" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.888029 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt"] Nov 25 16:30:26 crc kubenswrapper[4743]: E1125 16:30:26.888473 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2aeec84-22a9-4f07-a1e2-12e61f62f09c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.888490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2aeec84-22a9-4f07-a1e2-12e61f62f09c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.888705 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2aeec84-22a9-4f07-a1e2-12e61f62f09c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.889466 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.893257 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.893368 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.893495 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.893609 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:30:26 crc kubenswrapper[4743]: I1125 16:30:26.897222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt"] Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.071762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.071914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6zn\" (UniqueName: \"kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.071960 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.173422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.173569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6zn\" (UniqueName: \"kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.173630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.179416 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.179680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.193849 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6zn\" (UniqueName: \"kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.207069 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:27 crc kubenswrapper[4743]: I1125 16:30:27.859625 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt"] Nov 25 16:30:28 crc kubenswrapper[4743]: I1125 16:30:28.839989 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" event={"ID":"0d82431d-8bd6-4d1a-850d-d8c543994421","Type":"ContainerStarted","Data":"11e1305ba67f8adfc09a341bf370f79372ea7495fd9bde24d1f4c3264a3ef9a5"} Nov 25 16:30:29 crc kubenswrapper[4743]: I1125 16:30:29.850932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" event={"ID":"0d82431d-8bd6-4d1a-850d-d8c543994421","Type":"ContainerStarted","Data":"9496e44cc414e245a930e8471a5bea5161c87a0fd02157ace2bbebfdccd6b217"} Nov 25 16:30:29 crc kubenswrapper[4743]: I1125 16:30:29.867749 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" podStartSLOduration=3.148383042 podStartE2EDuration="3.867731303s" podCreationTimestamp="2025-11-25 16:30:26 +0000 UTC" firstStartedPulling="2025-11-25 16:30:27.867155689 +0000 UTC m=+1906.988995238" lastFinishedPulling="2025-11-25 16:30:28.58650395 +0000 UTC m=+1907.708343499" observedRunningTime="2025-11-25 16:30:29.865624997 +0000 UTC m=+1908.987464546" watchObservedRunningTime="2025-11-25 16:30:29.867731303 +0000 UTC m=+1908.989570852" Nov 25 16:30:38 crc kubenswrapper[4743]: I1125 16:30:38.924960 4743 generic.go:334] "Generic (PLEG): container finished" podID="0d82431d-8bd6-4d1a-850d-d8c543994421" containerID="9496e44cc414e245a930e8471a5bea5161c87a0fd02157ace2bbebfdccd6b217" exitCode=0 Nov 25 16:30:38 crc kubenswrapper[4743]: I1125 16:30:38.925062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" event={"ID":"0d82431d-8bd6-4d1a-850d-d8c543994421","Type":"ContainerDied","Data":"9496e44cc414e245a930e8471a5bea5161c87a0fd02157ace2bbebfdccd6b217"} Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.343402 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.513547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory\") pod \"0d82431d-8bd6-4d1a-850d-d8c543994421\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.513782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key\") pod \"0d82431d-8bd6-4d1a-850d-d8c543994421\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.513913 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql6zn\" (UniqueName: \"kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn\") pod \"0d82431d-8bd6-4d1a-850d-d8c543994421\" (UID: \"0d82431d-8bd6-4d1a-850d-d8c543994421\") " Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.522918 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn" (OuterVolumeSpecName: "kube-api-access-ql6zn") pod "0d82431d-8bd6-4d1a-850d-d8c543994421" (UID: "0d82431d-8bd6-4d1a-850d-d8c543994421"). InnerVolumeSpecName "kube-api-access-ql6zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.553065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d82431d-8bd6-4d1a-850d-d8c543994421" (UID: "0d82431d-8bd6-4d1a-850d-d8c543994421"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.553186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory" (OuterVolumeSpecName: "inventory") pod "0d82431d-8bd6-4d1a-850d-d8c543994421" (UID: "0d82431d-8bd6-4d1a-850d-d8c543994421"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.616754 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql6zn\" (UniqueName: \"kubernetes.io/projected/0d82431d-8bd6-4d1a-850d-d8c543994421-kube-api-access-ql6zn\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.616799 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.616813 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d82431d-8bd6-4d1a-850d-d8c543994421-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.948973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" 
event={"ID":"0d82431d-8bd6-4d1a-850d-d8c543994421","Type":"ContainerDied","Data":"11e1305ba67f8adfc09a341bf370f79372ea7495fd9bde24d1f4c3264a3ef9a5"} Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.949018 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt" Nov 25 16:30:40 crc kubenswrapper[4743]: I1125 16:30:40.949020 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e1305ba67f8adfc09a341bf370f79372ea7495fd9bde24d1f4c3264a3ef9a5" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.033845 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689"] Nov 25 16:30:41 crc kubenswrapper[4743]: E1125 16:30:41.034313 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d82431d-8bd6-4d1a-850d-d8c543994421" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.034338 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d82431d-8bd6-4d1a-850d-d8c543994421" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.034558 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d82431d-8bd6-4d1a-850d-d8c543994421" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.035262 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038021 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038279 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038547 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038694 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038793 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.038891 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.041230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.042773 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.045260 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689"] Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.126420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c648\" (UniqueName: 
\"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.126937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127145 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127297 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127378 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127576 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.127854 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.128003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.128094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.128179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c648\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: 
\"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" 
Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.229941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230034 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230098 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230153 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.230201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.235792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.235810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.236120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.237075 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.238147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.238198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.238368 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.238405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.240485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.240505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.241351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.241821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.242265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.250863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c648\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kf689\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc 
kubenswrapper[4743]: I1125 16:30:41.353938 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.860045 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689"] Nov 25 16:30:41 crc kubenswrapper[4743]: I1125 16:30:41.958142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" event={"ID":"522738de-cb3a-424d-ae01-b73bd3bcd8c6","Type":"ContainerStarted","Data":"a1c3236999bcb979b32c34ccf4c1774636ae6ca229a23a14214af70d3446729a"} Nov 25 16:30:42 crc kubenswrapper[4743]: I1125 16:30:42.306612 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:30:42 crc kubenswrapper[4743]: I1125 16:30:42.970061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" event={"ID":"522738de-cb3a-424d-ae01-b73bd3bcd8c6","Type":"ContainerStarted","Data":"2b050b7dc9e59fb436d947a4e4ba19c17777cb47f398fbd9ce4714ae2ff61ea5"} Nov 25 16:30:43 crc kubenswrapper[4743]: I1125 16:30:42.999939 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" podStartSLOduration=2.55967221 podStartE2EDuration="2.999919567s" podCreationTimestamp="2025-11-25 16:30:40 +0000 UTC" firstStartedPulling="2025-11-25 16:30:41.863587624 +0000 UTC m=+1920.985427173" lastFinishedPulling="2025-11-25 16:30:42.303834981 +0000 UTC m=+1921.425674530" observedRunningTime="2025-11-25 16:30:42.994363982 +0000 UTC m=+1922.116203541" watchObservedRunningTime="2025-11-25 16:30:42.999919567 +0000 UTC m=+1922.121759136" Nov 25 16:31:20 crc kubenswrapper[4743]: I1125 16:31:20.265489 4743 generic.go:334] "Generic (PLEG): 
container finished" podID="522738de-cb3a-424d-ae01-b73bd3bcd8c6" containerID="2b050b7dc9e59fb436d947a4e4ba19c17777cb47f398fbd9ce4714ae2ff61ea5" exitCode=0 Nov 25 16:31:20 crc kubenswrapper[4743]: I1125 16:31:20.265569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" event={"ID":"522738de-cb3a-424d-ae01-b73bd3bcd8c6","Type":"ContainerDied","Data":"2b050b7dc9e59fb436d947a4e4ba19c17777cb47f398fbd9ce4714ae2ff61ea5"} Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.645365 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.764998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.765498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.765543 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc 
kubenswrapper[4743]: I1125 16:31:21.765632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.765784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c648\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.765820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766580 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766683 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766714 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle\") pod \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.766788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle\") pod 
\"522738de-cb3a-424d-ae01-b73bd3bcd8c6\" (UID: \"522738de-cb3a-424d-ae01-b73bd3bcd8c6\") " Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.772622 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.772706 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.774691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.774784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648" (OuterVolumeSpecName: "kube-api-access-9c648") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "kube-api-access-9c648". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.774784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.774824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.775329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.775657 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.776245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.777464 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.777924 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.778804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.799837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.801289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory" (OuterVolumeSpecName: "inventory") pod "522738de-cb3a-424d-ae01-b73bd3bcd8c6" (UID: "522738de-cb3a-424d-ae01-b73bd3bcd8c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869095 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869142 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869158 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869176 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869188 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869200 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869210 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869222 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869235 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869249 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc 
kubenswrapper[4743]: I1125 16:31:21.869263 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869275 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869287 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c648\" (UniqueName: \"kubernetes.io/projected/522738de-cb3a-424d-ae01-b73bd3bcd8c6-kube-api-access-9c648\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:21 crc kubenswrapper[4743]: I1125 16:31:21.869298 4743 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522738de-cb3a-424d-ae01-b73bd3bcd8c6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.282362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" event={"ID":"522738de-cb3a-424d-ae01-b73bd3bcd8c6","Type":"ContainerDied","Data":"a1c3236999bcb979b32c34ccf4c1774636ae6ca229a23a14214af70d3446729a"} Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.282404 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1c3236999bcb979b32c34ccf4c1774636ae6ca229a23a14214af70d3446729a" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.282444 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kf689" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.376019 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm"] Nov 25 16:31:22 crc kubenswrapper[4743]: E1125 16:31:22.376426 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522738de-cb3a-424d-ae01-b73bd3bcd8c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.376448 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="522738de-cb3a-424d-ae01-b73bd3bcd8c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.376747 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="522738de-cb3a-424d-ae01-b73bd3bcd8c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.377305 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.382632 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.382632 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.383011 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.384766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.385146 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.391020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm"] Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.479074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.479133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.479173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.479288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.479323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpn7\" (UniqueName: \"kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.581398 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.581671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpn7\" (UniqueName: 
\"kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.581763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.581816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.581861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.583165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc 
kubenswrapper[4743]: I1125 16:31:22.586166 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.586513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.586870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.600138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpn7\" (UniqueName: \"kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-75jlm\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:22 crc kubenswrapper[4743]: I1125 16:31:22.692512 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:31:23 crc kubenswrapper[4743]: I1125 16:31:23.198943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm"] Nov 25 16:31:23 crc kubenswrapper[4743]: W1125 16:31:23.202421 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2073fba4_3e3f_4c49_ae69_265ffbc47f68.slice/crio-e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b WatchSource:0}: Error finding container e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b: Status 404 returned error can't find the container with id e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b Nov 25 16:31:23 crc kubenswrapper[4743]: I1125 16:31:23.292304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" event={"ID":"2073fba4-3e3f-4c49-ae69-265ffbc47f68","Type":"ContainerStarted","Data":"e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b"} Nov 25 16:31:24 crc kubenswrapper[4743]: I1125 16:31:24.301886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" event={"ID":"2073fba4-3e3f-4c49-ae69-265ffbc47f68","Type":"ContainerStarted","Data":"a4a1c22c89cd30d81d8a3dd870fc459431e30b66115395c3fb6dcdd25d65e957"} Nov 25 16:31:24 crc kubenswrapper[4743]: I1125 16:31:24.321139 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" podStartSLOduration=1.8802931809999999 podStartE2EDuration="2.321114996s" podCreationTimestamp="2025-11-25 16:31:22 +0000 UTC" firstStartedPulling="2025-11-25 16:31:23.20467243 +0000 UTC m=+1962.326511979" lastFinishedPulling="2025-11-25 16:31:23.645494245 +0000 UTC m=+1962.767333794" observedRunningTime="2025-11-25 
16:31:24.315877 +0000 UTC m=+1963.437716569" watchObservedRunningTime="2025-11-25 16:31:24.321114996 +0000 UTC m=+1963.442954545" Nov 25 16:32:20 crc kubenswrapper[4743]: I1125 16:32:20.077770 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:32:20 crc kubenswrapper[4743]: I1125 16:32:20.078308 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:32:25 crc kubenswrapper[4743]: I1125 16:32:25.808526 4743 generic.go:334] "Generic (PLEG): container finished" podID="2073fba4-3e3f-4c49-ae69-265ffbc47f68" containerID="a4a1c22c89cd30d81d8a3dd870fc459431e30b66115395c3fb6dcdd25d65e957" exitCode=0 Nov 25 16:32:25 crc kubenswrapper[4743]: I1125 16:32:25.808608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" event={"ID":"2073fba4-3e3f-4c49-ae69-265ffbc47f68","Type":"ContainerDied","Data":"a4a1c22c89cd30d81d8a3dd870fc459431e30b66115395c3fb6dcdd25d65e957"} Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.202756 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.295966 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle\") pod \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.296021 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory\") pod \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.296142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kpn7\" (UniqueName: \"kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7\") pod \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.296170 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key\") pod \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.296196 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0\") pod \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\" (UID: \"2073fba4-3e3f-4c49-ae69-265ffbc47f68\") " Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.302298 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7" (OuterVolumeSpecName: "kube-api-access-5kpn7") pod "2073fba4-3e3f-4c49-ae69-265ffbc47f68" (UID: "2073fba4-3e3f-4c49-ae69-265ffbc47f68"). InnerVolumeSpecName "kube-api-access-5kpn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.302552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2073fba4-3e3f-4c49-ae69-265ffbc47f68" (UID: "2073fba4-3e3f-4c49-ae69-265ffbc47f68"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.322228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2073fba4-3e3f-4c49-ae69-265ffbc47f68" (UID: "2073fba4-3e3f-4c49-ae69-265ffbc47f68"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.323735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2073fba4-3e3f-4c49-ae69-265ffbc47f68" (UID: "2073fba4-3e3f-4c49-ae69-265ffbc47f68"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.331648 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory" (OuterVolumeSpecName: "inventory") pod "2073fba4-3e3f-4c49-ae69-265ffbc47f68" (UID: "2073fba4-3e3f-4c49-ae69-265ffbc47f68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.398732 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.398764 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.398774 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kpn7\" (UniqueName: \"kubernetes.io/projected/2073fba4-3e3f-4c49-ae69-265ffbc47f68-kube-api-access-5kpn7\") on node \"crc\" DevicePath \"\"" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.398783 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.398791 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2073fba4-3e3f-4c49-ae69-265ffbc47f68-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.825219 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" event={"ID":"2073fba4-3e3f-4c49-ae69-265ffbc47f68","Type":"ContainerDied","Data":"e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b"} Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.825264 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e772a258d3038f2741cdd93ebf9ae044b3d39afb66f826debf2033235b549f0b" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.825268 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-75jlm" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.908810 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv"] Nov 25 16:32:27 crc kubenswrapper[4743]: E1125 16:32:27.909281 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2073fba4-3e3f-4c49-ae69-265ffbc47f68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.909306 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2073fba4-3e3f-4c49-ae69-265ffbc47f68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.909554 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2073fba4-3e3f-4c49-ae69-265ffbc47f68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.911073 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.918504 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv"] Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.920542 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.920558 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.920753 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.920871 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.920940 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 25 16:32:27 crc kubenswrapper[4743]: I1125 16:32:27.924379 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.008753 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.008848 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.008902 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.008923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdg4\" (UniqueName: \"kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.008985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.009013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdg4\" (UniqueName: 
\"kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.110583 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.115248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.115271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.115433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.115565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.116437 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.128405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdg4\" (UniqueName: \"kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc 
kubenswrapper[4743]: I1125 16:32:28.227095 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.731462 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv"] Nov 25 16:32:28 crc kubenswrapper[4743]: I1125 16:32:28.834236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" event={"ID":"389b43ba-821f-48b6-b924-46ddda4e2d11","Type":"ContainerStarted","Data":"213e7d0f63072cfed5810b08360e0dc133378a4df9623c77633b3eea7de4bfd1"} Nov 25 16:32:29 crc kubenswrapper[4743]: I1125 16:32:29.848756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" event={"ID":"389b43ba-821f-48b6-b924-46ddda4e2d11","Type":"ContainerStarted","Data":"725a0ee684e390ed233e593579c458d767c453fd7ff02b1d846ba93256aa3337"} Nov 25 16:32:29 crc kubenswrapper[4743]: I1125 16:32:29.867499 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" podStartSLOduration=2.284491141 podStartE2EDuration="2.867439379s" podCreationTimestamp="2025-11-25 16:32:27 +0000 UTC" firstStartedPulling="2025-11-25 16:32:28.736975421 +0000 UTC m=+2027.858814960" lastFinishedPulling="2025-11-25 16:32:29.319923649 +0000 UTC m=+2028.441763198" observedRunningTime="2025-11-25 16:32:29.865157357 +0000 UTC m=+2028.986996926" watchObservedRunningTime="2025-11-25 16:32:29.867439379 +0000 UTC m=+2028.989278928" Nov 25 16:32:50 crc kubenswrapper[4743]: I1125 16:32:50.077549 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:32:50 crc kubenswrapper[4743]: I1125 16:32:50.078097 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.077010 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.077513 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.077567 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.078324 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.078390 4743 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac" gracePeriod=600 Nov 25 16:33:20 crc kubenswrapper[4743]: E1125 16:33:20.221053 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c29847_f70f_4ab1_9691_685966384446.slice/crio-61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac.scope\": RecentStats: unable to find data in memory cache]" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.252966 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac" exitCode=0 Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.253227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac"} Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.253259 4743 scope.go:117] "RemoveContainer" containerID="f97f50990e3f4eca81420fdbf293f90a09303a7ca42b07e2fcb29790d797c806" Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.255011 4743 generic.go:334] "Generic (PLEG): container finished" podID="389b43ba-821f-48b6-b924-46ddda4e2d11" containerID="725a0ee684e390ed233e593579c458d767c453fd7ff02b1d846ba93256aa3337" exitCode=0 Nov 25 16:33:20 crc kubenswrapper[4743]: I1125 16:33:20.255033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" 
event={"ID":"389b43ba-821f-48b6-b924-46ddda4e2d11","Type":"ContainerDied","Data":"725a0ee684e390ed233e593579c458d767c453fd7ff02b1d846ba93256aa3337"} Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.265470 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964"} Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.652275 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.697282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdg4\" (UniqueName: \"kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.697360 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.697385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.697459 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.697567 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.698064 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory\") pod \"389b43ba-821f-48b6-b924-46ddda4e2d11\" (UID: \"389b43ba-821f-48b6-b924-46ddda4e2d11\") " Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.704322 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4" (OuterVolumeSpecName: "kube-api-access-bqdg4") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "kube-api-access-bqdg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.706032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.726827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory" (OuterVolumeSpecName: "inventory") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.729057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.729521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.736984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "389b43ba-821f-48b6-b924-46ddda4e2d11" (UID: "389b43ba-821f-48b6-b924-46ddda4e2d11"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805276 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805322 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdg4\" (UniqueName: \"kubernetes.io/projected/389b43ba-821f-48b6-b924-46ddda4e2d11-kube-api-access-bqdg4\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805339 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805354 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805370 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:21 crc kubenswrapper[4743]: I1125 16:33:21.805378 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/389b43ba-821f-48b6-b924-46ddda4e2d11-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.275066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" 
event={"ID":"389b43ba-821f-48b6-b924-46ddda4e2d11","Type":"ContainerDied","Data":"213e7d0f63072cfed5810b08360e0dc133378a4df9623c77633b3eea7de4bfd1"} Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.275108 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213e7d0f63072cfed5810b08360e0dc133378a4df9623c77633b3eea7de4bfd1" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.275923 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.371777 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6"] Nov 25 16:33:22 crc kubenswrapper[4743]: E1125 16:33:22.372175 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389b43ba-821f-48b6-b924-46ddda4e2d11" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.372192 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="389b43ba-821f-48b6-b924-46ddda4e2d11" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.372386 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="389b43ba-821f-48b6-b924-46ddda4e2d11" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.372976 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.374959 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.375844 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.378011 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.378278 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.378472 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.387610 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6"] Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.517701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.517942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794h5\" (UniqueName: \"kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: 
\"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.518025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.518112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.518191 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.620492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-794h5\" (UniqueName: \"kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.620549 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.620608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.620657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.620767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.626269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.626509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.627284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.627684 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.641489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-794h5\" (UniqueName: \"kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:22 crc kubenswrapper[4743]: I1125 16:33:22.691450 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:33:23 crc kubenswrapper[4743]: I1125 16:33:23.028325 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6"] Nov 25 16:33:23 crc kubenswrapper[4743]: W1125 16:33:23.031795 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7568caf6_7fa3_429a_90f2_40cbd4dece9d.slice/crio-d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02 WatchSource:0}: Error finding container d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02: Status 404 returned error can't find the container with id d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02 Nov 25 16:33:23 crc kubenswrapper[4743]: I1125 16:33:23.317146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" event={"ID":"7568caf6-7fa3-429a-90f2-40cbd4dece9d","Type":"ContainerStarted","Data":"d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02"} Nov 25 16:33:26 crc kubenswrapper[4743]: I1125 16:33:26.342263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" event={"ID":"7568caf6-7fa3-429a-90f2-40cbd4dece9d","Type":"ContainerStarted","Data":"d4d94659818210ed0c5147479ecb9884a7ac056d21fe97f0c1bf37bb79904702"} Nov 25 16:33:26 crc kubenswrapper[4743]: I1125 16:33:26.362750 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" podStartSLOduration=2.431550982 podStartE2EDuration="4.362730187s" podCreationTimestamp="2025-11-25 16:33:22 +0000 UTC" firstStartedPulling="2025-11-25 16:33:23.034393571 +0000 UTC m=+2082.156233120" lastFinishedPulling="2025-11-25 16:33:24.965572766 +0000 UTC m=+2084.087412325" 
observedRunningTime="2025-11-25 16:33:26.357397548 +0000 UTC m=+2085.479237107" watchObservedRunningTime="2025-11-25 16:33:26.362730187 +0000 UTC m=+2085.484569736" Nov 25 16:33:59 crc kubenswrapper[4743]: I1125 16:33:59.947527 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:33:59 crc kubenswrapper[4743]: I1125 16:33:59.950698 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:33:59 crc kubenswrapper[4743]: I1125 16:33:59.961903 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.070354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmpk\" (UniqueName: \"kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.070787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.071139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 
16:34:00.173255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.173405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmpk\" (UniqueName: \"kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.173438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.174383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.174540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.193556 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmpk\" (UniqueName: \"kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk\") pod \"community-operators-77hwc\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.271755 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:00 crc kubenswrapper[4743]: I1125 16:34:00.784458 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:34:01 crc kubenswrapper[4743]: E1125 16:34:01.148694 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96081602_4be1_4c48_9279_540358cc8d79.slice/crio-4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96081602_4be1_4c48_9279_540358cc8d79.slice/crio-conmon-4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e.scope\": RecentStats: unable to find data in memory cache]" Nov 25 16:34:01 crc kubenswrapper[4743]: I1125 16:34:01.653820 4743 generic.go:334] "Generic (PLEG): container finished" podID="96081602-4be1-4c48-9279-540358cc8d79" containerID="4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e" exitCode=0 Nov 25 16:34:01 crc kubenswrapper[4743]: I1125 16:34:01.654035 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerDied","Data":"4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e"} Nov 25 16:34:01 crc kubenswrapper[4743]: I1125 16:34:01.654411 
4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerStarted","Data":"74a229e65115059544d23958d198ff83743977c2ca000240e61b42469ef9b8b1"} Nov 25 16:34:02 crc kubenswrapper[4743]: I1125 16:34:02.664933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerStarted","Data":"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca"} Nov 25 16:34:03 crc kubenswrapper[4743]: I1125 16:34:03.691082 4743 generic.go:334] "Generic (PLEG): container finished" podID="96081602-4be1-4c48-9279-540358cc8d79" containerID="e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca" exitCode=0 Nov 25 16:34:03 crc kubenswrapper[4743]: I1125 16:34:03.691544 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerDied","Data":"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca"} Nov 25 16:34:04 crc kubenswrapper[4743]: I1125 16:34:04.703172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerStarted","Data":"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f"} Nov 25 16:34:04 crc kubenswrapper[4743]: I1125 16:34:04.728860 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-77hwc" podStartSLOduration=3.321373571 podStartE2EDuration="5.72884241s" podCreationTimestamp="2025-11-25 16:33:59 +0000 UTC" firstStartedPulling="2025-11-25 16:34:01.656221171 +0000 UTC m=+2120.778060720" lastFinishedPulling="2025-11-25 16:34:04.06369001 +0000 UTC m=+2123.185529559" observedRunningTime="2025-11-25 
16:34:04.719927499 +0000 UTC m=+2123.841767048" watchObservedRunningTime="2025-11-25 16:34:04.72884241 +0000 UTC m=+2123.850681959" Nov 25 16:34:10 crc kubenswrapper[4743]: I1125 16:34:10.271899 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:10 crc kubenswrapper[4743]: I1125 16:34:10.272211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:10 crc kubenswrapper[4743]: I1125 16:34:10.314344 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:10 crc kubenswrapper[4743]: I1125 16:34:10.797383 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:10 crc kubenswrapper[4743]: I1125 16:34:10.848612 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:34:12 crc kubenswrapper[4743]: I1125 16:34:12.763051 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-77hwc" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="registry-server" containerID="cri-o://30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f" gracePeriod=2 Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.250854 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.407522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities\") pod \"96081602-4be1-4c48-9279-540358cc8d79\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.407710 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content\") pod \"96081602-4be1-4c48-9279-540358cc8d79\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.407785 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcmpk\" (UniqueName: \"kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk\") pod \"96081602-4be1-4c48-9279-540358cc8d79\" (UID: \"96081602-4be1-4c48-9279-540358cc8d79\") " Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.408479 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities" (OuterVolumeSpecName: "utilities") pod "96081602-4be1-4c48-9279-540358cc8d79" (UID: "96081602-4be1-4c48-9279-540358cc8d79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.413999 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk" (OuterVolumeSpecName: "kube-api-access-pcmpk") pod "96081602-4be1-4c48-9279-540358cc8d79" (UID: "96081602-4be1-4c48-9279-540358cc8d79"). InnerVolumeSpecName "kube-api-access-pcmpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.452193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96081602-4be1-4c48-9279-540358cc8d79" (UID: "96081602-4be1-4c48-9279-540358cc8d79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.509943 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.509980 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96081602-4be1-4c48-9279-540358cc8d79-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.509992 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcmpk\" (UniqueName: \"kubernetes.io/projected/96081602-4be1-4c48-9279-540358cc8d79-kube-api-access-pcmpk\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.774553 4743 generic.go:334] "Generic (PLEG): container finished" podID="96081602-4be1-4c48-9279-540358cc8d79" containerID="30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f" exitCode=0 Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.774649 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77hwc" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.785244 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerDied","Data":"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f"} Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.785337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77hwc" event={"ID":"96081602-4be1-4c48-9279-540358cc8d79","Type":"ContainerDied","Data":"74a229e65115059544d23958d198ff83743977c2ca000240e61b42469ef9b8b1"} Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.785364 4743 scope.go:117] "RemoveContainer" containerID="30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.816057 4743 scope.go:117] "RemoveContainer" containerID="e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.817195 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.824919 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-77hwc"] Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.837520 4743 scope.go:117] "RemoveContainer" containerID="4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.887744 4743 scope.go:117] "RemoveContainer" containerID="30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f" Nov 25 16:34:13 crc kubenswrapper[4743]: E1125 16:34:13.888341 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f\": container with ID starting with 30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f not found: ID does not exist" containerID="30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.888441 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f"} err="failed to get container status \"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f\": rpc error: code = NotFound desc = could not find container \"30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f\": container with ID starting with 30ff7f4201f54adb14b64de28e07cf2a73fd1a91f05fa343279ab59623ca979f not found: ID does not exist" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.888514 4743 scope.go:117] "RemoveContainer" containerID="e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca" Nov 25 16:34:13 crc kubenswrapper[4743]: E1125 16:34:13.888837 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca\": container with ID starting with e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca not found: ID does not exist" containerID="e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.888864 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca"} err="failed to get container status \"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca\": rpc error: code = NotFound desc = could not find container \"e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca\": container with ID 
starting with e5d3e7dfe47ace28c1443f771f4d38711f457208c4b6d5c6cae6e8af7f721aca not found: ID does not exist" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.888883 4743 scope.go:117] "RemoveContainer" containerID="4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e" Nov 25 16:34:13 crc kubenswrapper[4743]: E1125 16:34:13.889227 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e\": container with ID starting with 4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e not found: ID does not exist" containerID="4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e" Nov 25 16:34:13 crc kubenswrapper[4743]: I1125 16:34:13.889341 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e"} err="failed to get container status \"4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e\": rpc error: code = NotFound desc = could not find container \"4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e\": container with ID starting with 4e53c191129530caeef883888b4b0dd025c3c9411c256d1698e432252f3ed35e not found: ID does not exist" Nov 25 16:34:15 crc kubenswrapper[4743]: I1125 16:34:15.785058 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96081602-4be1-4c48-9279-540358cc8d79" path="/var/lib/kubelet/pods/96081602-4be1-4c48-9279-540358cc8d79/volumes" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.268452 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:34 crc kubenswrapper[4743]: E1125 16:34:34.269500 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="registry-server" Nov 25 16:34:34 crc 
kubenswrapper[4743]: I1125 16:34:34.269518 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="registry-server" Nov 25 16:34:34 crc kubenswrapper[4743]: E1125 16:34:34.269555 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="extract-content" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.269564 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="extract-content" Nov 25 16:34:34 crc kubenswrapper[4743]: E1125 16:34:34.269615 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="extract-utilities" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.269625 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="extract-utilities" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.269861 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="96081602-4be1-4c48-9279-540358cc8d79" containerName="registry-server" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.271674 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.283990 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.377873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.378230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.378277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgs2\" (UniqueName: \"kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.480052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.480127 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.480156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgs2\" (UniqueName: \"kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.480576 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.480685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.503501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgs2\" (UniqueName: \"kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2\") pod \"certified-operators-c2x8f\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:34 crc kubenswrapper[4743]: I1125 16:34:34.604499 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:35 crc kubenswrapper[4743]: I1125 16:34:35.117056 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:35 crc kubenswrapper[4743]: I1125 16:34:35.956299 4743 generic.go:334] "Generic (PLEG): container finished" podID="6354099b-c55f-4913-bab2-d5d4858a3372" containerID="28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2" exitCode=0 Nov 25 16:34:35 crc kubenswrapper[4743]: I1125 16:34:35.956362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerDied","Data":"28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2"} Nov 25 16:34:35 crc kubenswrapper[4743]: I1125 16:34:35.956402 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerStarted","Data":"e187f24aaa3828b3fff2aec0878c2f231d2af1cccb2f33b5af50c3cf1bbb7293"} Nov 25 16:34:36 crc kubenswrapper[4743]: I1125 16:34:36.967226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerStarted","Data":"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574"} Nov 25 16:34:37 crc kubenswrapper[4743]: I1125 16:34:37.977229 4743 generic.go:334] "Generic (PLEG): container finished" podID="6354099b-c55f-4913-bab2-d5d4858a3372" containerID="39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574" exitCode=0 Nov 25 16:34:37 crc kubenswrapper[4743]: I1125 16:34:37.977332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" 
event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerDied","Data":"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574"} Nov 25 16:34:37 crc kubenswrapper[4743]: I1125 16:34:37.978488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerStarted","Data":"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5"} Nov 25 16:34:37 crc kubenswrapper[4743]: I1125 16:34:37.994665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2x8f" podStartSLOduration=2.532223928 podStartE2EDuration="3.994646532s" podCreationTimestamp="2025-11-25 16:34:34 +0000 UTC" firstStartedPulling="2025-11-25 16:34:35.958665964 +0000 UTC m=+2155.080505513" lastFinishedPulling="2025-11-25 16:34:37.421088568 +0000 UTC m=+2156.542928117" observedRunningTime="2025-11-25 16:34:37.993697901 +0000 UTC m=+2157.115537470" watchObservedRunningTime="2025-11-25 16:34:37.994646532 +0000 UTC m=+2157.116486081" Nov 25 16:34:44 crc kubenswrapper[4743]: I1125 16:34:44.605560 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:44 crc kubenswrapper[4743]: I1125 16:34:44.606013 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:44 crc kubenswrapper[4743]: I1125 16:34:44.653661 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:45 crc kubenswrapper[4743]: I1125 16:34:45.094244 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:45 crc kubenswrapper[4743]: I1125 16:34:45.137723 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.064135 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2x8f" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="registry-server" containerID="cri-o://1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5" gracePeriod=2 Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.544536 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.610545 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content\") pod \"6354099b-c55f-4913-bab2-d5d4858a3372\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.610629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvgs2\" (UniqueName: \"kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2\") pod \"6354099b-c55f-4913-bab2-d5d4858a3372\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.610698 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities\") pod \"6354099b-c55f-4913-bab2-d5d4858a3372\" (UID: \"6354099b-c55f-4913-bab2-d5d4858a3372\") " Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.612253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities" (OuterVolumeSpecName: "utilities") pod "6354099b-c55f-4913-bab2-d5d4858a3372" (UID: 
"6354099b-c55f-4913-bab2-d5d4858a3372"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.625887 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2" (OuterVolumeSpecName: "kube-api-access-dvgs2") pod "6354099b-c55f-4913-bab2-d5d4858a3372" (UID: "6354099b-c55f-4913-bab2-d5d4858a3372"). InnerVolumeSpecName "kube-api-access-dvgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.712930 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.712964 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvgs2\" (UniqueName: \"kubernetes.io/projected/6354099b-c55f-4913-bab2-d5d4858a3372-kube-api-access-dvgs2\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.724337 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6354099b-c55f-4913-bab2-d5d4858a3372" (UID: "6354099b-c55f-4913-bab2-d5d4858a3372"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:34:47 crc kubenswrapper[4743]: I1125 16:34:47.814935 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6354099b-c55f-4913-bab2-d5d4858a3372-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.076037 4743 generic.go:334] "Generic (PLEG): container finished" podID="6354099b-c55f-4913-bab2-d5d4858a3372" containerID="1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5" exitCode=0 Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.076092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerDied","Data":"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5"} Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.076398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2x8f" event={"ID":"6354099b-c55f-4913-bab2-d5d4858a3372","Type":"ContainerDied","Data":"e187f24aaa3828b3fff2aec0878c2f231d2af1cccb2f33b5af50c3cf1bbb7293"} Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.076421 4743 scope.go:117] "RemoveContainer" containerID="1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.076102 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c2x8f" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.102113 4743 scope.go:117] "RemoveContainer" containerID="39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.104291 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.115078 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2x8f"] Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.126425 4743 scope.go:117] "RemoveContainer" containerID="28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.175078 4743 scope.go:117] "RemoveContainer" containerID="1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5" Nov 25 16:34:48 crc kubenswrapper[4743]: E1125 16:34:48.175467 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5\": container with ID starting with 1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5 not found: ID does not exist" containerID="1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.175493 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5"} err="failed to get container status \"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5\": rpc error: code = NotFound desc = could not find container \"1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5\": container with ID starting with 1a38b3c98ef71abc291042adfe1b24722ed9401a421c802e7a7a640c794d57f5 not 
found: ID does not exist" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.175515 4743 scope.go:117] "RemoveContainer" containerID="39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574" Nov 25 16:34:48 crc kubenswrapper[4743]: E1125 16:34:48.176089 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574\": container with ID starting with 39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574 not found: ID does not exist" containerID="39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.176118 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574"} err="failed to get container status \"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574\": rpc error: code = NotFound desc = could not find container \"39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574\": container with ID starting with 39abb3d39795466ce4127f8ec2a9a43873ecb5fa81e84ba191a55a83cdf32574 not found: ID does not exist" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.176142 4743 scope.go:117] "RemoveContainer" containerID="28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2" Nov 25 16:34:48 crc kubenswrapper[4743]: E1125 16:34:48.177311 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2\": container with ID starting with 28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2 not found: ID does not exist" containerID="28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2" Nov 25 16:34:48 crc kubenswrapper[4743]: I1125 16:34:48.177341 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2"} err="failed to get container status \"28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2\": rpc error: code = NotFound desc = could not find container \"28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2\": container with ID starting with 28ba48b019d65acd34c11e59e4bcc9c96c164bcf6125556468d8dce0aa3d21b2 not found: ID does not exist" Nov 25 16:34:49 crc kubenswrapper[4743]: I1125 16:34:49.785855 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" path="/var/lib/kubelet/pods/6354099b-c55f-4913-bab2-d5d4858a3372/volumes" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.149372 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:17 crc kubenswrapper[4743]: E1125 16:35:17.150241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="registry-server" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.150253 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="registry-server" Nov 25 16:35:17 crc kubenswrapper[4743]: E1125 16:35:17.150266 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="extract-utilities" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.150272 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="extract-utilities" Nov 25 16:35:17 crc kubenswrapper[4743]: E1125 16:35:17.150295 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="extract-content" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 
16:35:17.150303 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="extract-content" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.150487 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6354099b-c55f-4913-bab2-d5d4858a3372" containerName="registry-server" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.151775 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.167846 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.282453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps6hz\" (UniqueName: \"kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.282528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.282944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 
16:35:17.384516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.384622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps6hz\" (UniqueName: \"kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.384675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.385167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.385168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.405326 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ps6hz\" (UniqueName: \"kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz\") pod \"redhat-operators-49tcs\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.491780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:17 crc kubenswrapper[4743]: I1125 16:35:17.960887 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:18 crc kubenswrapper[4743]: I1125 16:35:18.366233 4743 generic.go:334] "Generic (PLEG): container finished" podID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerID="49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f" exitCode=0 Nov 25 16:35:18 crc kubenswrapper[4743]: I1125 16:35:18.366277 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerDied","Data":"49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f"} Nov 25 16:35:18 crc kubenswrapper[4743]: I1125 16:35:18.366328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerStarted","Data":"f3f43592c07ad2410f3f99cf1d8778d779fcc9b502721899d00bd8cfe5562fa8"} Nov 25 16:35:18 crc kubenswrapper[4743]: I1125 16:35:18.368448 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:35:20 crc kubenswrapper[4743]: I1125 16:35:20.077340 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:35:20 crc kubenswrapper[4743]: I1125 16:35:20.078008 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:35:20 crc kubenswrapper[4743]: I1125 16:35:20.386445 4743 generic.go:334] "Generic (PLEG): container finished" podID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerID="480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c" exitCode=0 Nov 25 16:35:20 crc kubenswrapper[4743]: I1125 16:35:20.386483 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerDied","Data":"480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c"} Nov 25 16:35:25 crc kubenswrapper[4743]: I1125 16:35:25.440360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerStarted","Data":"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf"} Nov 25 16:35:25 crc kubenswrapper[4743]: I1125 16:35:25.467495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49tcs" podStartSLOduration=2.089318846 podStartE2EDuration="8.467472361s" podCreationTimestamp="2025-11-25 16:35:17 +0000 UTC" firstStartedPulling="2025-11-25 16:35:18.368152852 +0000 UTC m=+2197.489992401" lastFinishedPulling="2025-11-25 16:35:24.746306357 +0000 UTC m=+2203.868145916" observedRunningTime="2025-11-25 16:35:25.460376547 +0000 UTC m=+2204.582216116" watchObservedRunningTime="2025-11-25 16:35:25.467472361 +0000 UTC m=+2204.589311920" Nov 25 16:35:27 crc 
kubenswrapper[4743]: I1125 16:35:27.129270 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.131385 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.140753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.267386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlmjf\" (UniqueName: \"kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.267468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.267781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.369827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.369967 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.370075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlmjf\" (UniqueName: \"kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.370583 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.370574 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.393369 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlmjf\" (UniqueName: 
\"kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf\") pod \"redhat-marketplace-shtlm\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.467808 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.492130 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.492177 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:27 crc kubenswrapper[4743]: I1125 16:35:27.944679 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:28 crc kubenswrapper[4743]: I1125 16:35:28.468574 4743 generic.go:334] "Generic (PLEG): container finished" podID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerID="f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05" exitCode=0 Nov 25 16:35:28 crc kubenswrapper[4743]: I1125 16:35:28.468642 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerDied","Data":"f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05"} Nov 25 16:35:28 crc kubenswrapper[4743]: I1125 16:35:28.468675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerStarted","Data":"a2032ad374821383106b31e8e73997a3ac71bc745265c0757d2cd34e25c8a7fd"} Nov 25 16:35:28 crc kubenswrapper[4743]: I1125 16:35:28.538147 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-49tcs" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="registry-server" probeResult="failure" output=< Nov 25 16:35:28 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 25 16:35:28 crc kubenswrapper[4743]: > Nov 25 16:35:30 crc kubenswrapper[4743]: I1125 16:35:30.491199 4743 generic.go:334] "Generic (PLEG): container finished" podID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerID="7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed" exitCode=0 Nov 25 16:35:30 crc kubenswrapper[4743]: I1125 16:35:30.491294 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerDied","Data":"7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed"} Nov 25 16:35:31 crc kubenswrapper[4743]: I1125 16:35:31.503903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerStarted","Data":"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35"} Nov 25 16:35:31 crc kubenswrapper[4743]: I1125 16:35:31.531392 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-shtlm" podStartSLOduration=2.080673708 podStartE2EDuration="4.531373981s" podCreationTimestamp="2025-11-25 16:35:27 +0000 UTC" firstStartedPulling="2025-11-25 16:35:28.471021493 +0000 UTC m=+2207.592861042" lastFinishedPulling="2025-11-25 16:35:30.921721766 +0000 UTC m=+2210.043561315" observedRunningTime="2025-11-25 16:35:31.523588514 +0000 UTC m=+2210.645428093" watchObservedRunningTime="2025-11-25 16:35:31.531373981 +0000 UTC m=+2210.653213520" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.468423 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.468984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.516311 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.541118 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.583909 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:37 crc kubenswrapper[4743]: I1125 16:35:37.612413 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:39 crc kubenswrapper[4743]: I1125 16:35:39.748760 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:39 crc kubenswrapper[4743]: I1125 16:35:39.749251 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49tcs" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="registry-server" containerID="cri-o://4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf" gracePeriod=2 Nov 25 16:35:39 crc kubenswrapper[4743]: I1125 16:35:39.947799 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:39 crc kubenswrapper[4743]: I1125 16:35:39.948259 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-shtlm" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="registry-server" 
containerID="cri-o://ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35" gracePeriod=2 Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.199725 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.302549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content\") pod \"645114ef-8bc2-4413-90d6-d29edd710e3b\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.302625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities\") pod \"645114ef-8bc2-4413-90d6-d29edd710e3b\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.302688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps6hz\" (UniqueName: \"kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz\") pod \"645114ef-8bc2-4413-90d6-d29edd710e3b\" (UID: \"645114ef-8bc2-4413-90d6-d29edd710e3b\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.303435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities" (OuterVolumeSpecName: "utilities") pod "645114ef-8bc2-4413-90d6-d29edd710e3b" (UID: "645114ef-8bc2-4413-90d6-d29edd710e3b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.308777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz" (OuterVolumeSpecName: "kube-api-access-ps6hz") pod "645114ef-8bc2-4413-90d6-d29edd710e3b" (UID: "645114ef-8bc2-4413-90d6-d29edd710e3b"). InnerVolumeSpecName "kube-api-access-ps6hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.361142 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.391761 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "645114ef-8bc2-4413-90d6-d29edd710e3b" (UID: "645114ef-8bc2-4413-90d6-d29edd710e3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.404406 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content\") pod \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.404623 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlmjf\" (UniqueName: \"kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf\") pod \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.404897 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities\") pod \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\" (UID: \"892a61e0-0528-4aa8-89b1-0eaa9ab8304c\") " Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.405621 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps6hz\" (UniqueName: \"kubernetes.io/projected/645114ef-8bc2-4413-90d6-d29edd710e3b-kube-api-access-ps6hz\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.405728 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.405810 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645114ef-8bc2-4413-90d6-d29edd710e3b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.405841 
4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities" (OuterVolumeSpecName: "utilities") pod "892a61e0-0528-4aa8-89b1-0eaa9ab8304c" (UID: "892a61e0-0528-4aa8-89b1-0eaa9ab8304c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.407972 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf" (OuterVolumeSpecName: "kube-api-access-jlmjf") pod "892a61e0-0528-4aa8-89b1-0eaa9ab8304c" (UID: "892a61e0-0528-4aa8-89b1-0eaa9ab8304c"). InnerVolumeSpecName "kube-api-access-jlmjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.421438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "892a61e0-0528-4aa8-89b1-0eaa9ab8304c" (UID: "892a61e0-0528-4aa8-89b1-0eaa9ab8304c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.507803 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlmjf\" (UniqueName: \"kubernetes.io/projected/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-kube-api-access-jlmjf\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.507837 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.507846 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892a61e0-0528-4aa8-89b1-0eaa9ab8304c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.603778 4743 generic.go:334] "Generic (PLEG): container finished" podID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerID="ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35" exitCode=0 Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.603844 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shtlm" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.603860 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerDied","Data":"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35"} Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.603885 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shtlm" event={"ID":"892a61e0-0528-4aa8-89b1-0eaa9ab8304c","Type":"ContainerDied","Data":"a2032ad374821383106b31e8e73997a3ac71bc745265c0757d2cd34e25c8a7fd"} Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.603903 4743 scope.go:117] "RemoveContainer" containerID="ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.609055 4743 generic.go:334] "Generic (PLEG): container finished" podID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerID="4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf" exitCode=0 Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.609109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerDied","Data":"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf"} Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.609129 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49tcs" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.609163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49tcs" event={"ID":"645114ef-8bc2-4413-90d6-d29edd710e3b","Type":"ContainerDied","Data":"f3f43592c07ad2410f3f99cf1d8778d779fcc9b502721899d00bd8cfe5562fa8"} Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.639894 4743 scope.go:117] "RemoveContainer" containerID="7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.641909 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.654975 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-shtlm"] Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.665687 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.666952 4743 scope.go:117] "RemoveContainer" containerID="f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.677231 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49tcs"] Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.686792 4743 scope.go:117] "RemoveContainer" containerID="ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.687209 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35\": container with ID starting with ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35 not found: ID does not 
exist" containerID="ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.687245 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35"} err="failed to get container status \"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35\": rpc error: code = NotFound desc = could not find container \"ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35\": container with ID starting with ec0dd900b7804c9991135fa27178c090ffab764477b95be189ab242bf57bfb35 not found: ID does not exist" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.687270 4743 scope.go:117] "RemoveContainer" containerID="7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.687697 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed\": container with ID starting with 7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed not found: ID does not exist" containerID="7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.687730 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed"} err="failed to get container status \"7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed\": rpc error: code = NotFound desc = could not find container \"7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed\": container with ID starting with 7b88d04a85bc2884a94744e09c4069fdca134731e4f4021bf5eb2cb626baf5ed not found: ID does not exist" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.687752 4743 scope.go:117] 
"RemoveContainer" containerID="f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.688018 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05\": container with ID starting with f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05 not found: ID does not exist" containerID="f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.688043 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05"} err="failed to get container status \"f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05\": rpc error: code = NotFound desc = could not find container \"f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05\": container with ID starting with f7ccf473814bb5bf9322639f9b56cf704b38d434493b631b0601e6849a036d05 not found: ID does not exist" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.688059 4743 scope.go:117] "RemoveContainer" containerID="4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.710051 4743 scope.go:117] "RemoveContainer" containerID="480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.740144 4743 scope.go:117] "RemoveContainer" containerID="49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.761287 4743 scope.go:117] "RemoveContainer" containerID="4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.762120 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf\": container with ID starting with 4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf not found: ID does not exist" containerID="4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.762159 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf"} err="failed to get container status \"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf\": rpc error: code = NotFound desc = could not find container \"4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf\": container with ID starting with 4ff4d03be0e54178450f565ab0c0cd8fbb64e88a9ed3d09b4770888f0f12f9bf not found: ID does not exist" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.762183 4743 scope.go:117] "RemoveContainer" containerID="480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.762672 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c\": container with ID starting with 480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c not found: ID does not exist" containerID="480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.762696 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c"} err="failed to get container status \"480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c\": rpc error: code = NotFound desc = could not find container 
\"480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c\": container with ID starting with 480ce9fa8be5b5ccad50abfe6981d9d4f50f06e396abf391a59cd373a442064c not found: ID does not exist" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.762711 4743 scope.go:117] "RemoveContainer" containerID="49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f" Nov 25 16:35:40 crc kubenswrapper[4743]: E1125 16:35:40.763198 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f\": container with ID starting with 49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f not found: ID does not exist" containerID="49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f" Nov 25 16:35:40 crc kubenswrapper[4743]: I1125 16:35:40.763227 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f"} err="failed to get container status \"49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f\": rpc error: code = NotFound desc = could not find container \"49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f\": container with ID starting with 49c67aa3b617581089b956809a86aeab2a6107aebac16387ee3b1f2c3ed3fb6f not found: ID does not exist" Nov 25 16:35:41 crc kubenswrapper[4743]: I1125 16:35:41.788925 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" path="/var/lib/kubelet/pods/645114ef-8bc2-4413-90d6-d29edd710e3b/volumes" Nov 25 16:35:41 crc kubenswrapper[4743]: I1125 16:35:41.790058 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" path="/var/lib/kubelet/pods/892a61e0-0528-4aa8-89b1-0eaa9ab8304c/volumes" Nov 25 16:35:50 crc kubenswrapper[4743]: I1125 16:35:50.077730 
4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:35:50 crc kubenswrapper[4743]: I1125 16:35:50.078498 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.077616 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.078164 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.078228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.079267 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.079374 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" gracePeriod=600 Nov 25 16:36:20 crc kubenswrapper[4743]: E1125 16:36:20.261354 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.949199 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" exitCode=0 Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.949282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964"} Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.949636 4743 scope.go:117] "RemoveContainer" containerID="61ddd1cf4766f7f9fdf9d1bbccdeb4e1e763abd124bcc4fda1e5e4965acde9ac" Nov 25 16:36:20 crc kubenswrapper[4743]: I1125 16:36:20.950217 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:36:20 crc kubenswrapper[4743]: E1125 16:36:20.950467 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:36:32 crc kubenswrapper[4743]: I1125 16:36:32.775784 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:36:32 crc kubenswrapper[4743]: E1125 16:36:32.776471 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:36:44 crc kubenswrapper[4743]: I1125 16:36:44.774538 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:36:44 crc kubenswrapper[4743]: E1125 16:36:44.775310 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:36:59 crc kubenswrapper[4743]: I1125 16:36:59.775129 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:36:59 crc kubenswrapper[4743]: E1125 
16:36:59.776038 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:37:12 crc kubenswrapper[4743]: I1125 16:37:12.774897 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:37:12 crc kubenswrapper[4743]: E1125 16:37:12.775505 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:37:27 crc kubenswrapper[4743]: I1125 16:37:27.775730 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:37:27 crc kubenswrapper[4743]: E1125 16:37:27.776782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:37:39 crc kubenswrapper[4743]: I1125 16:37:39.775083 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:37:39 crc 
kubenswrapper[4743]: E1125 16:37:39.775900 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:37:42 crc kubenswrapper[4743]: I1125 16:37:42.699553 4743 generic.go:334] "Generic (PLEG): container finished" podID="7568caf6-7fa3-429a-90f2-40cbd4dece9d" containerID="d4d94659818210ed0c5147479ecb9884a7ac056d21fe97f0c1bf37bb79904702" exitCode=0 Nov 25 16:37:42 crc kubenswrapper[4743]: I1125 16:37:42.699668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" event={"ID":"7568caf6-7fa3-429a-90f2-40cbd4dece9d","Type":"ContainerDied","Data":"d4d94659818210ed0c5147479ecb9884a7ac056d21fe97f0c1bf37bb79904702"} Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.088172 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.241894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0\") pod \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.242085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-794h5\" (UniqueName: \"kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5\") pod \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.242109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle\") pod \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.242179 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory\") pod \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.242729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key\") pod \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\" (UID: \"7568caf6-7fa3-429a-90f2-40cbd4dece9d\") " Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.417728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7568caf6-7fa3-429a-90f2-40cbd4dece9d" (UID: "7568caf6-7fa3-429a-90f2-40cbd4dece9d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.419133 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.426242 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5" (OuterVolumeSpecName: "kube-api-access-794h5") pod "7568caf6-7fa3-429a-90f2-40cbd4dece9d" (UID: "7568caf6-7fa3-429a-90f2-40cbd4dece9d"). InnerVolumeSpecName "kube-api-access-794h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.427545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory" (OuterVolumeSpecName: "inventory") pod "7568caf6-7fa3-429a-90f2-40cbd4dece9d" (UID: "7568caf6-7fa3-429a-90f2-40cbd4dece9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.428377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7568caf6-7fa3-429a-90f2-40cbd4dece9d" (UID: "7568caf6-7fa3-429a-90f2-40cbd4dece9d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.428420 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7568caf6-7fa3-429a-90f2-40cbd4dece9d" (UID: "7568caf6-7fa3-429a-90f2-40cbd4dece9d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.520808 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.520868 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.520881 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-794h5\" (UniqueName: \"kubernetes.io/projected/7568caf6-7fa3-429a-90f2-40cbd4dece9d-kube-api-access-794h5\") on node \"crc\" DevicePath \"\"" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.520891 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7568caf6-7fa3-429a-90f2-40cbd4dece9d-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.719779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" event={"ID":"7568caf6-7fa3-429a-90f2-40cbd4dece9d","Type":"ContainerDied","Data":"d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02"} Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.719821 4743 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="d3064ade5b850d6053079117fec7e35b6935a213f1f2ead92564baccab245c02" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.719862 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.803754 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2"] Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804113 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804129 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="extract-utilities" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804152 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="extract-utilities" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804168 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804176 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804193 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="extract-content" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804199 4743 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="extract-content" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804207 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="extract-content" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804213 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="extract-content" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="extract-utilities" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="extract-utilities" Nov 25 16:37:44 crc kubenswrapper[4743]: E1125 16:37:44.804252 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7568caf6-7fa3-429a-90f2-40cbd4dece9d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804259 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7568caf6-7fa3-429a-90f2-40cbd4dece9d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804410 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="645114ef-8bc2-4413-90d6-d29edd710e3b" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804427 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="892a61e0-0528-4aa8-89b1-0eaa9ab8304c" containerName="registry-server" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.804441 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7568caf6-7fa3-429a-90f2-40cbd4dece9d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 16:37:44 crc 
kubenswrapper[4743]: I1125 16:37:44.805109 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.807793 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.807991 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.808108 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.808268 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.809392 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.809517 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.809735 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.854238 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2"] Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931398 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kt9w\" (UniqueName: \"kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931544 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.931593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.932016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:44 crc kubenswrapper[4743]: I1125 16:37:44.932233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034246 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2kt9w\" (UniqueName: \"kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.034335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.035070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: 
\"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.040008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.040030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.040221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.040567 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.040796 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.041214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.041304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.051923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kt9w\" (UniqueName: \"kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w\") pod \"nova-edpm-deployment-openstack-edpm-ipam-wfdh2\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.123455 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.629202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2"] Nov 25 16:37:45 crc kubenswrapper[4743]: I1125 16:37:45.729682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" event={"ID":"a80ee7c3-2c23-4079-994f-b04e8a21516e","Type":"ContainerStarted","Data":"15768bb1e5d8dcddb665a538b6294beb73627cdeba3349de67954250fe6f6346"} Nov 25 16:37:46 crc kubenswrapper[4743]: I1125 16:37:46.738554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" event={"ID":"a80ee7c3-2c23-4079-994f-b04e8a21516e","Type":"ContainerStarted","Data":"c29c523eccedbbb0a0150841c8c3194be83f053d87bf18a23e5b87e209dd3c69"} Nov 25 16:37:46 crc kubenswrapper[4743]: I1125 16:37:46.755871 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" podStartSLOduration=2.352769806 podStartE2EDuration="2.755853154s" podCreationTimestamp="2025-11-25 16:37:44 +0000 UTC" firstStartedPulling="2025-11-25 16:37:45.633825341 +0000 UTC m=+2344.755664890" lastFinishedPulling="2025-11-25 16:37:46.036908689 +0000 UTC m=+2345.158748238" observedRunningTime="2025-11-25 16:37:46.752062514 +0000 UTC m=+2345.873902083" watchObservedRunningTime="2025-11-25 16:37:46.755853154 +0000 UTC m=+2345.877692703" Nov 25 16:37:50 crc kubenswrapper[4743]: I1125 16:37:50.775267 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:37:50 crc kubenswrapper[4743]: E1125 16:37:50.775872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:38:01 crc kubenswrapper[4743]: I1125 16:38:01.781316 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:38:01 crc kubenswrapper[4743]: E1125 16:38:01.782682 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:38:13 crc kubenswrapper[4743]: I1125 16:38:13.774610 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:38:13 crc kubenswrapper[4743]: E1125 16:38:13.775401 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:38:25 crc kubenswrapper[4743]: I1125 16:38:25.776099 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:38:25 crc kubenswrapper[4743]: E1125 16:38:25.777385 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:38:37 crc kubenswrapper[4743]: I1125 16:38:37.775272 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:38:37 crc kubenswrapper[4743]: E1125 16:38:37.776171 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:38:49 crc kubenswrapper[4743]: I1125 16:38:49.775087 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:38:49 crc kubenswrapper[4743]: E1125 16:38:49.776005 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:39:04 crc kubenswrapper[4743]: I1125 16:39:04.775751 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:39:04 crc kubenswrapper[4743]: E1125 16:39:04.776558 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:39:17 crc kubenswrapper[4743]: I1125 16:39:17.775464 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:39:17 crc kubenswrapper[4743]: E1125 16:39:17.776240 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:39:30 crc kubenswrapper[4743]: I1125 16:39:30.774657 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:39:30 crc kubenswrapper[4743]: E1125 16:39:30.775416 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:39:41 crc kubenswrapper[4743]: I1125 16:39:41.780780 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:39:41 crc kubenswrapper[4743]: E1125 16:39:41.781531 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:39:52 crc kubenswrapper[4743]: I1125 16:39:52.775292 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:39:52 crc kubenswrapper[4743]: E1125 16:39:52.776063 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:40:04 crc kubenswrapper[4743]: I1125 16:40:04.774534 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:40:04 crc kubenswrapper[4743]: E1125 16:40:04.775512 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:40:15 crc kubenswrapper[4743]: I1125 16:40:15.775316 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:40:15 crc kubenswrapper[4743]: E1125 16:40:15.776497 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:40:27 crc kubenswrapper[4743]: I1125 16:40:27.774822 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:40:27 crc kubenswrapper[4743]: E1125 16:40:27.775539 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:40:28 crc kubenswrapper[4743]: I1125 16:40:28.538554 4743 generic.go:334] "Generic (PLEG): container finished" podID="a80ee7c3-2c23-4079-994f-b04e8a21516e" containerID="c29c523eccedbbb0a0150841c8c3194be83f053d87bf18a23e5b87e209dd3c69" exitCode=0 Nov 25 16:40:28 crc kubenswrapper[4743]: I1125 16:40:28.538625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" event={"ID":"a80ee7c3-2c23-4079-994f-b04e8a21516e","Type":"ContainerDied","Data":"c29c523eccedbbb0a0150841c8c3194be83f053d87bf18a23e5b87e209dd3c69"} Nov 25 16:40:29 crc kubenswrapper[4743]: I1125 16:40:29.948279 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.132845 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.132900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.132926 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.132956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.133000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.133149 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.133201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.133271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.133326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kt9w\" (UniqueName: \"kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w\") pod \"a80ee7c3-2c23-4079-994f-b04e8a21516e\" (UID: \"a80ee7c3-2c23-4079-994f-b04e8a21516e\") " Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.141831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.143364 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w" (OuterVolumeSpecName: "kube-api-access-2kt9w") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "kube-api-access-2kt9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.159389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.164324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.165330 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.167635 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory" (OuterVolumeSpecName: "inventory") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.170943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.180332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.189669 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a80ee7c3-2c23-4079-994f-b04e8a21516e" (UID: "a80ee7c3-2c23-4079-994f-b04e8a21516e"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234926 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234957 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234967 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234976 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kt9w\" (UniqueName: \"kubernetes.io/projected/a80ee7c3-2c23-4079-994f-b04e8a21516e-kube-api-access-2kt9w\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234984 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.234993 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.235002 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 25 
16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.235010 4743 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.235021 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80ee7c3-2c23-4079-994f-b04e8a21516e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.558968 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" event={"ID":"a80ee7c3-2c23-4079-994f-b04e8a21516e","Type":"ContainerDied","Data":"15768bb1e5d8dcddb665a538b6294beb73627cdeba3349de67954250fe6f6346"} Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.559158 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15768bb1e5d8dcddb665a538b6294beb73627cdeba3349de67954250fe6f6346" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.559059 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-wfdh2" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.668641 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql"] Nov 25 16:40:30 crc kubenswrapper[4743]: E1125 16:40:30.669124 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80ee7c3-2c23-4079-994f-b04e8a21516e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.669149 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80ee7c3-2c23-4079-994f-b04e8a21516e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.669379 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80ee7c3-2c23-4079-994f-b04e8a21516e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.670193 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.673642 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.673897 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.674052 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.674305 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.674642 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ktslx" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.688997 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql"] Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: 
\"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845772 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.845996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.846208 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27zg\" (UniqueName: \"kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.948675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27zg\" (UniqueName: \"kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.948757 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.948945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.949000 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.949105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.949273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.950188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.953577 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: 
\"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.953847 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.953896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.954356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.955508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.955874 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.967580 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27zg\" (UniqueName: \"kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:30 crc kubenswrapper[4743]: I1125 16:40:30.994361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:40:31 crc kubenswrapper[4743]: I1125 16:40:31.494740 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql"] Nov 25 16:40:31 crc kubenswrapper[4743]: I1125 16:40:31.501110 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:40:31 crc kubenswrapper[4743]: I1125 16:40:31.571342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" event={"ID":"4dd9e80f-8e99-46a6-b669-b2ec10285463","Type":"ContainerStarted","Data":"808c4494128568d25147caa138ff453f0452eccb50b936def269c0c789a741ea"} Nov 25 16:40:33 crc kubenswrapper[4743]: I1125 16:40:33.589330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" 
event={"ID":"4dd9e80f-8e99-46a6-b669-b2ec10285463","Type":"ContainerStarted","Data":"67016810fac88e2b2f26eeaa26303a21f30d20496af743b849f794b4811a007f"} Nov 25 16:40:33 crc kubenswrapper[4743]: I1125 16:40:33.623172 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" podStartSLOduration=2.780065837 podStartE2EDuration="3.623147376s" podCreationTimestamp="2025-11-25 16:40:30 +0000 UTC" firstStartedPulling="2025-11-25 16:40:31.500890905 +0000 UTC m=+2510.622730454" lastFinishedPulling="2025-11-25 16:40:32.343972424 +0000 UTC m=+2511.465811993" observedRunningTime="2025-11-25 16:40:33.603919429 +0000 UTC m=+2512.725759008" watchObservedRunningTime="2025-11-25 16:40:33.623147376 +0000 UTC m=+2512.744986935" Nov 25 16:40:40 crc kubenswrapper[4743]: I1125 16:40:40.775053 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:40:40 crc kubenswrapper[4743]: E1125 16:40:40.775932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:40:54 crc kubenswrapper[4743]: I1125 16:40:54.775731 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:40:54 crc kubenswrapper[4743]: E1125 16:40:54.776995 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:41:05 crc kubenswrapper[4743]: I1125 16:41:05.774418 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:41:05 crc kubenswrapper[4743]: E1125 16:41:05.775327 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:41:18 crc kubenswrapper[4743]: I1125 16:41:18.774686 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:41:18 crc kubenswrapper[4743]: E1125 16:41:18.775348 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:41:32 crc kubenswrapper[4743]: I1125 16:41:32.775471 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964" Nov 25 16:41:33 crc kubenswrapper[4743]: I1125 16:41:33.107056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" 
event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869"} Nov 25 16:42:53 crc kubenswrapper[4743]: I1125 16:42:53.806199 4743 generic.go:334] "Generic (PLEG): container finished" podID="4dd9e80f-8e99-46a6-b669-b2ec10285463" containerID="67016810fac88e2b2f26eeaa26303a21f30d20496af743b849f794b4811a007f" exitCode=0 Nov 25 16:42:53 crc kubenswrapper[4743]: I1125 16:42:53.806299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" event={"ID":"4dd9e80f-8e99-46a6-b669-b2ec10285463","Type":"ContainerDied","Data":"67016810fac88e2b2f26eeaa26303a21f30d20496af743b849f794b4811a007f"} Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.189121 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263221 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1\") pod 
\"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263434 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.263625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k27zg\" (UniqueName: \"kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg\") pod \"4dd9e80f-8e99-46a6-b669-b2ec10285463\" (UID: \"4dd9e80f-8e99-46a6-b669-b2ec10285463\") " Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.276798 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.278774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg" (OuterVolumeSpecName: "kube-api-access-k27zg") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "kube-api-access-k27zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.299687 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory" (OuterVolumeSpecName: "inventory") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.302571 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.303187 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.310294 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.311850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4dd9e80f-8e99-46a6-b669-b2ec10285463" (UID: "4dd9e80f-8e99-46a6-b669-b2ec10285463"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365816 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k27zg\" (UniqueName: \"kubernetes.io/projected/4dd9e80f-8e99-46a6-b669-b2ec10285463-kube-api-access-k27zg\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365850 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365862 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365873 4743 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365884 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365896 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.365906 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4dd9e80f-8e99-46a6-b669-b2ec10285463-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.824157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" event={"ID":"4dd9e80f-8e99-46a6-b669-b2ec10285463","Type":"ContainerDied","Data":"808c4494128568d25147caa138ff453f0452eccb50b936def269c0c789a741ea"} Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.824196 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808c4494128568d25147caa138ff453f0452eccb50b936def269c0c789a741ea" Nov 25 16:42:55 crc kubenswrapper[4743]: I1125 16:42:55.824203 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.352502 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 16:43:42 crc kubenswrapper[4743]: E1125 16:43:42.353371 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd9e80f-8e99-46a6-b669-b2ec10285463" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.353384 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd9e80f-8e99-46a6-b669-b2ec10285463" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.353549 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd9e80f-8e99-46a6-b669-b2ec10285463" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.362604 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.366065 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.366088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hl8vr" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.366174 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.366091 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.372968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.434667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.434784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.434840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536310 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536333 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwmd\" (UniqueName: \"kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536610 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.536788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.537482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.537717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.542617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.638895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: 
\"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwmd\" (UniqueName: \"kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.639694 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.642818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.642905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.654332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwmd\" (UniqueName: \"kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.665535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " pod="openstack/tempest-tests-tempest" Nov 25 16:43:42 crc kubenswrapper[4743]: I1125 16:43:42.702919 4743 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 16:43:43 crc kubenswrapper[4743]: I1125 16:43:43.133699 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 25 16:43:43 crc kubenswrapper[4743]: W1125 16:43:43.140095 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47459f25_57d0_4c84_8f42_81c8698769bd.slice/crio-d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b WatchSource:0}: Error finding container d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b: Status 404 returned error can't find the container with id d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b Nov 25 16:43:43 crc kubenswrapper[4743]: I1125 16:43:43.244379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"47459f25-57d0-4c84-8f42-81c8698769bd","Type":"ContainerStarted","Data":"d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b"} Nov 25 16:43:50 crc kubenswrapper[4743]: I1125 16:43:50.077813 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:43:50 crc kubenswrapper[4743]: I1125 16:43:50.078350 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:44:20 crc kubenswrapper[4743]: I1125 16:44:20.077148 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:44:20 crc kubenswrapper[4743]: I1125 16:44:20.077663 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:44:22 crc kubenswrapper[4743]: E1125 16:44:22.992916 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 25 16:44:22 crc kubenswrapper[4743]: E1125 16:44:22.994299 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds
.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dwmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(47459f25-57d0-4c84-8f42-81c8698769bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 16:44:22 crc kubenswrapper[4743]: E1125 16:44:22.995625 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="47459f25-57d0-4c84-8f42-81c8698769bd" Nov 25 16:44:23 crc kubenswrapper[4743]: E1125 16:44:23.635790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="47459f25-57d0-4c84-8f42-81c8698769bd" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.434822 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69hkh"] Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.437787 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.448771 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69hkh"] Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.521645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jbj\" (UniqueName: \"kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.521758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.521920 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.623665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.623853 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.623956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jbj\" (UniqueName: \"kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.624221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.624222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.646004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jbj\" (UniqueName: \"kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj\") pod \"community-operators-69hkh\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") " pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:33 crc kubenswrapper[4743]: I1125 16:44:33.765680 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:34 crc kubenswrapper[4743]: I1125 16:44:34.262001 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69hkh"] Nov 25 16:44:34 crc kubenswrapper[4743]: I1125 16:44:34.715028 4743 generic.go:334] "Generic (PLEG): container finished" podID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerID="ecf4e9a8c049ad486d576fdf54207ec1cb095e8bf20222338373fe4e8340c545" exitCode=0 Nov 25 16:44:34 crc kubenswrapper[4743]: I1125 16:44:34.715075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerDied","Data":"ecf4e9a8c049ad486d576fdf54207ec1cb095e8bf20222338373fe4e8340c545"} Nov 25 16:44:34 crc kubenswrapper[4743]: I1125 16:44:34.715439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerStarted","Data":"823d57859e7d2b830a3ffae9964b37ca82e3e397daccdfc8857ec54dd0505985"} Nov 25 16:44:36 crc kubenswrapper[4743]: I1125 16:44:36.731190 4743 generic.go:334] "Generic (PLEG): container finished" podID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerID="b523313bc394a09c227b2493dedf329272928da0696c1782814fe22ed33af3a5" exitCode=0 Nov 25 16:44:36 crc kubenswrapper[4743]: I1125 16:44:36.731290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerDied","Data":"b523313bc394a09c227b2493dedf329272928da0696c1782814fe22ed33af3a5"} Nov 25 16:44:37 crc kubenswrapper[4743]: I1125 16:44:37.747254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" 
event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerStarted","Data":"a73e2c4d90f10e773195d3547f298ac2c0a15d320f24030bde48ee76d6237f26"} Nov 25 16:44:37 crc kubenswrapper[4743]: I1125 16:44:37.766764 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69hkh" podStartSLOduration=2.292943676 podStartE2EDuration="4.766750002s" podCreationTimestamp="2025-11-25 16:44:33 +0000 UTC" firstStartedPulling="2025-11-25 16:44:34.71672821 +0000 UTC m=+2753.838567769" lastFinishedPulling="2025-11-25 16:44:37.190534536 +0000 UTC m=+2756.312374095" observedRunningTime="2025-11-25 16:44:37.763675996 +0000 UTC m=+2756.885515565" watchObservedRunningTime="2025-11-25 16:44:37.766750002 +0000 UTC m=+2756.888589541" Nov 25 16:44:38 crc kubenswrapper[4743]: I1125 16:44:38.760769 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"47459f25-57d0-4c84-8f42-81c8698769bd","Type":"ContainerStarted","Data":"c9ee4843fb34d5c46c99de2780836f2af5b3bd1bd993eaa0662c86aa099420c7"} Nov 25 16:44:38 crc kubenswrapper[4743]: I1125 16:44:38.796959 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.73938591 podStartE2EDuration="57.796941826s" podCreationTimestamp="2025-11-25 16:43:41 +0000 UTC" firstStartedPulling="2025-11-25 16:43:43.142546033 +0000 UTC m=+2702.264385572" lastFinishedPulling="2025-11-25 16:44:37.200101939 +0000 UTC m=+2756.321941488" observedRunningTime="2025-11-25 16:44:38.781138348 +0000 UTC m=+2757.902977897" watchObservedRunningTime="2025-11-25 16:44:38.796941826 +0000 UTC m=+2757.918781375" Nov 25 16:44:43 crc kubenswrapper[4743]: I1125 16:44:43.766379 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69hkh" Nov 25 16:44:43 crc kubenswrapper[4743]: I1125 16:44:43.767024 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69hkh"
Nov 25 16:44:43 crc kubenswrapper[4743]: I1125 16:44:43.817316 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-69hkh"
Nov 25 16:44:43 crc kubenswrapper[4743]: I1125 16:44:43.871785 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69hkh"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.005652 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69hkh"]
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.005861 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69hkh" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="registry-server" containerID="cri-o://a73e2c4d90f10e773195d3547f298ac2c0a15d320f24030bde48ee76d6237f26" gracePeriod=2
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.211758 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.214312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.232252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.358478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xp8\" (UniqueName: \"kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.358566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.358624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.459767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xp8\" (UniqueName: \"kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.459835 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.459881 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.460407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.460442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.479859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xp8\" (UniqueName: \"kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8\") pod \"certified-operators-6bblr\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") " pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:46 crc kubenswrapper[4743]: I1125 16:44:46.536651 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:47 crc kubenswrapper[4743]: I1125 16:44:47.868266 4743 generic.go:334] "Generic (PLEG): container finished" podID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerID="a73e2c4d90f10e773195d3547f298ac2c0a15d320f24030bde48ee76d6237f26" exitCode=0
Nov 25 16:44:47 crc kubenswrapper[4743]: I1125 16:44:47.868329 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerDied","Data":"a73e2c4d90f10e773195d3547f298ac2c0a15d320f24030bde48ee76d6237f26"}
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.071301 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:44:48 crc kubenswrapper[4743]: W1125 16:44:48.073849 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8866bf_8687_40c9_9f35_88c111a814a3.slice/crio-ca7fd8b43500ce3c3b00f16812685deca83e05e19d01c96906199bcb42852aaa WatchSource:0}: Error finding container ca7fd8b43500ce3c3b00f16812685deca83e05e19d01c96906199bcb42852aaa: Status 404 returned error can't find the container with id ca7fd8b43500ce3c3b00f16812685deca83e05e19d01c96906199bcb42852aaa
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.205502 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69hkh"
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.301339 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jbj\" (UniqueName: \"kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj\") pod \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") "
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.301459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities\") pod \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") "
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.301480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content\") pod \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\" (UID: \"aac9b511-cafb-4ad6-91fd-5e1b20500d7a\") "
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.302525 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities" (OuterVolumeSpecName: "utilities") pod "aac9b511-cafb-4ad6-91fd-5e1b20500d7a" (UID: "aac9b511-cafb-4ad6-91fd-5e1b20500d7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.317759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj" (OuterVolumeSpecName: "kube-api-access-r6jbj") pod "aac9b511-cafb-4ad6-91fd-5e1b20500d7a" (UID: "aac9b511-cafb-4ad6-91fd-5e1b20500d7a"). InnerVolumeSpecName "kube-api-access-r6jbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.363784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aac9b511-cafb-4ad6-91fd-5e1b20500d7a" (UID: "aac9b511-cafb-4ad6-91fd-5e1b20500d7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.404097 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.404141 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.404157 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jbj\" (UniqueName: \"kubernetes.io/projected/aac9b511-cafb-4ad6-91fd-5e1b20500d7a-kube-api-access-r6jbj\") on node \"crc\" DevicePath \"\""
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.881261 4743 generic.go:334] "Generic (PLEG): container finished" podID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerID="6941082d449ea28f7f61843b5f898843e164a4c5b55665738de143fe5e865ad1" exitCode=0
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.881357 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerDied","Data":"6941082d449ea28f7f61843b5f898843e164a4c5b55665738de143fe5e865ad1"}
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.881699 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerStarted","Data":"ca7fd8b43500ce3c3b00f16812685deca83e05e19d01c96906199bcb42852aaa"}
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.884075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69hkh" event={"ID":"aac9b511-cafb-4ad6-91fd-5e1b20500d7a","Type":"ContainerDied","Data":"823d57859e7d2b830a3ffae9964b37ca82e3e397daccdfc8857ec54dd0505985"}
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.884153 4743 scope.go:117] "RemoveContainer" containerID="a73e2c4d90f10e773195d3547f298ac2c0a15d320f24030bde48ee76d6237f26"
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.884660 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69hkh"
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.917772 4743 scope.go:117] "RemoveContainer" containerID="b523313bc394a09c227b2493dedf329272928da0696c1782814fe22ed33af3a5"
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.924933 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69hkh"]
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.933046 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69hkh"]
Nov 25 16:44:48 crc kubenswrapper[4743]: I1125 16:44:48.939889 4743 scope.go:117] "RemoveContainer" containerID="ecf4e9a8c049ad486d576fdf54207ec1cb095e8bf20222338373fe4e8340c545"
Nov 25 16:44:49 crc kubenswrapper[4743]: I1125 16:44:49.784848 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" path="/var/lib/kubelet/pods/aac9b511-cafb-4ad6-91fd-5e1b20500d7a/volumes"
Nov 25 16:44:49 crc kubenswrapper[4743]: I1125 16:44:49.894260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerStarted","Data":"ec7a75bf2ff6e750cb128cbb56d826894f36f0e19503c84467e389c2f07641c5"}
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.077689 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.077742 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.077777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f"
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.078290 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.078354 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869" gracePeriod=600
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.905298 4743 generic.go:334] "Generic (PLEG): container finished" podID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerID="ec7a75bf2ff6e750cb128cbb56d826894f36f0e19503c84467e389c2f07641c5" exitCode=0
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.905398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerDied","Data":"ec7a75bf2ff6e750cb128cbb56d826894f36f0e19503c84467e389c2f07641c5"}
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.909072 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869" exitCode=0
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.909116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869"}
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.909149 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70"}
Nov 25 16:44:50 crc kubenswrapper[4743]: I1125 16:44:50.909169 4743 scope.go:117] "RemoveContainer" containerID="4e5695500940c54ab6bbc13e3d52feba229cb16b5a2e42484f3f0fcaf8d2c964"
Nov 25 16:44:51 crc kubenswrapper[4743]: I1125 16:44:51.923849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerStarted","Data":"37fa671ddd9ad598695e07fe5c65f6c0dd5f79a5f24a0fe4ddde14618364c253"}
Nov 25 16:44:51 crc kubenswrapper[4743]: I1125 16:44:51.952634 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6bblr" podStartSLOduration=3.397936828 podStartE2EDuration="5.952616797s" podCreationTimestamp="2025-11-25 16:44:46 +0000 UTC" firstStartedPulling="2025-11-25 16:44:48.883264945 +0000 UTC m=+2768.005104514" lastFinishedPulling="2025-11-25 16:44:51.437944894 +0000 UTC m=+2770.559784483" observedRunningTime="2025-11-25 16:44:51.944644906 +0000 UTC m=+2771.066484485" watchObservedRunningTime="2025-11-25 16:44:51.952616797 +0000 UTC m=+2771.074456346"
Nov 25 16:44:56 crc kubenswrapper[4743]: I1125 16:44:56.537125 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:56 crc kubenswrapper[4743]: I1125 16:44:56.537769 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:56 crc kubenswrapper[4743]: I1125 16:44:56.583445 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:57 crc kubenswrapper[4743]: I1125 16:44:57.019312 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:44:59 crc kubenswrapper[4743]: I1125 16:44:59.605097 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:44:59 crc kubenswrapper[4743]: I1125 16:44:59.606452 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6bblr" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="registry-server" containerID="cri-o://37fa671ddd9ad598695e07fe5c65f6c0dd5f79a5f24a0fe4ddde14618364c253" gracePeriod=2
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.002070 4743 generic.go:334] "Generic (PLEG): container finished" podID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerID="37fa671ddd9ad598695e07fe5c65f6c0dd5f79a5f24a0fe4ddde14618364c253" exitCode=0
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.002442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerDied","Data":"37fa671ddd9ad598695e07fe5c65f6c0dd5f79a5f24a0fe4ddde14618364c253"}
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.142749 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148155 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"]
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148570 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="extract-utilities"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148668 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="extract-utilities"
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148691 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="extract-content"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148698 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="extract-content"
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148736 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148742 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148757 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="extract-utilities"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148764 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="extract-utilities"
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148775 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="extract-content"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148781 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="extract-content"
Nov 25 16:45:00 crc kubenswrapper[4743]: E1125 16:45:00.148792 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148798 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.148984 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.149007 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac9b511-cafb-4ad6-91fd-5e1b20500d7a" containerName="registry-server"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.149622 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.153307 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.153723 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.160407 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"]
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.215236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4xp8\" (UniqueName: \"kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8\") pod \"5d8866bf-8687-40c9-9f35-88c111a814a3\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") "
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.215351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities\") pod \"5d8866bf-8687-40c9-9f35-88c111a814a3\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") "
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.215380 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content\") pod \"5d8866bf-8687-40c9-9f35-88c111a814a3\" (UID: \"5d8866bf-8687-40c9-9f35-88c111a814a3\") "
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.215679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.216015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities" (OuterVolumeSpecName: "utilities") pod "5d8866bf-8687-40c9-9f35-88c111a814a3" (UID: "5d8866bf-8687-40c9-9f35-88c111a814a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.216141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7595x\" (UniqueName: \"kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.216188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.216315 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.222138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8" (OuterVolumeSpecName: "kube-api-access-t4xp8") pod "5d8866bf-8687-40c9-9f35-88c111a814a3" (UID: "5d8866bf-8687-40c9-9f35-88c111a814a3"). InnerVolumeSpecName "kube-api-access-t4xp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.259616 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d8866bf-8687-40c9-9f35-88c111a814a3" (UID: "5d8866bf-8687-40c9-9f35-88c111a814a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.318340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7595x\" (UniqueName: \"kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.318395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.318450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.318504 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4xp8\" (UniqueName: \"kubernetes.io/projected/5d8866bf-8687-40c9-9f35-88c111a814a3-kube-api-access-t4xp8\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.318518 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d8866bf-8687-40c9-9f35-88c111a814a3-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.320004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.334398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.344432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7595x\" (UniqueName: \"kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x\") pod \"collect-profiles-29401485-dfb2g\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.467834 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:00 crc kubenswrapper[4743]: I1125 16:45:00.936580 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"]
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.016471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6bblr" event={"ID":"5d8866bf-8687-40c9-9f35-88c111a814a3","Type":"ContainerDied","Data":"ca7fd8b43500ce3c3b00f16812685deca83e05e19d01c96906199bcb42852aaa"}
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.016547 4743 scope.go:117] "RemoveContainer" containerID="37fa671ddd9ad598695e07fe5c65f6c0dd5f79a5f24a0fe4ddde14618364c253"
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.016540 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6bblr"
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.019274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g" event={"ID":"d639f3f7-b808-4a57-9763-6624350ea374","Type":"ContainerStarted","Data":"b9ddabffee6c139ae0ea74f0d0c2708339b38d8f73d59b38a3e2ab8dcbab8641"}
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.047744 4743 scope.go:117] "RemoveContainer" containerID="ec7a75bf2ff6e750cb128cbb56d826894f36f0e19503c84467e389c2f07641c5"
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.076354 4743 scope.go:117] "RemoveContainer" containerID="6941082d449ea28f7f61843b5f898843e164a4c5b55665738de143fe5e865ad1"
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.080724 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.089822 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6bblr"]
Nov 25 16:45:01 crc kubenswrapper[4743]: I1125 16:45:01.787320 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8866bf-8687-40c9-9f35-88c111a814a3" path="/var/lib/kubelet/pods/5d8866bf-8687-40c9-9f35-88c111a814a3/volumes"
Nov 25 16:45:02 crc kubenswrapper[4743]: I1125 16:45:02.031583 4743 generic.go:334] "Generic (PLEG): container finished" podID="d639f3f7-b808-4a57-9763-6624350ea374" containerID="9e64d254948d328c935070a1f2d991b09f93432f9999a0df4ec883e0917f60eb" exitCode=0
Nov 25 16:45:02 crc kubenswrapper[4743]: I1125 16:45:02.031635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g" event={"ID":"d639f3f7-b808-4a57-9763-6624350ea374","Type":"ContainerDied","Data":"9e64d254948d328c935070a1f2d991b09f93432f9999a0df4ec883e0917f60eb"}
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.398168 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.472330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume\") pod \"d639f3f7-b808-4a57-9763-6624350ea374\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") "
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.472513 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7595x\" (UniqueName: \"kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x\") pod \"d639f3f7-b808-4a57-9763-6624350ea374\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") "
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.472550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume\") pod \"d639f3f7-b808-4a57-9763-6624350ea374\" (UID: \"d639f3f7-b808-4a57-9763-6624350ea374\") "
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.473010 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume" (OuterVolumeSpecName: "config-volume") pod "d639f3f7-b808-4a57-9763-6624350ea374" (UID: "d639f3f7-b808-4a57-9763-6624350ea374"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.478327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x" (OuterVolumeSpecName: "kube-api-access-7595x") pod "d639f3f7-b808-4a57-9763-6624350ea374" (UID: "d639f3f7-b808-4a57-9763-6624350ea374"). InnerVolumeSpecName "kube-api-access-7595x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.478441 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d639f3f7-b808-4a57-9763-6624350ea374" (UID: "d639f3f7-b808-4a57-9763-6624350ea374"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.575186 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d639f3f7-b808-4a57-9763-6624350ea374-config-volume\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.575222 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7595x\" (UniqueName: \"kubernetes.io/projected/d639f3f7-b808-4a57-9763-6624350ea374-kube-api-access-7595x\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:03 crc kubenswrapper[4743]: I1125 16:45:03.575239 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d639f3f7-b808-4a57-9763-6624350ea374-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 25 16:45:04 crc kubenswrapper[4743]: I1125 16:45:04.051147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g" event={"ID":"d639f3f7-b808-4a57-9763-6624350ea374","Type":"ContainerDied","Data":"b9ddabffee6c139ae0ea74f0d0c2708339b38d8f73d59b38a3e2ab8dcbab8641"}
Nov 25 16:45:04 crc kubenswrapper[4743]: I1125 16:45:04.051196 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9ddabffee6c139ae0ea74f0d0c2708339b38d8f73d59b38a3e2ab8dcbab8641"
Nov 25 16:45:04 crc kubenswrapper[4743]: I1125 16:45:04.051206 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401485-dfb2g"
Nov 25 16:45:04 crc kubenswrapper[4743]: I1125 16:45:04.469712 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"]
Nov 25 16:45:04 crc kubenswrapper[4743]: I1125 16:45:04.476752 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-92l65"]
Nov 25 16:45:05 crc kubenswrapper[4743]: I1125 16:45:05.792906 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b5ca28-1828-40b6-97cf-093f8027dab3" path="/var/lib/kubelet/pods/01b5ca28-1828-40b6-97cf-093f8027dab3/volumes"
Nov 25 16:45:10 crc kubenswrapper[4743]: I1125 16:45:10.699721 4743 scope.go:117] "RemoveContainer" containerID="d3e77469b1f425f9cb0765b94f6cb4a5e669d36ad636330b0b09bbcf337240d0"
Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.715064 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"]
Nov 25 16:46:30 crc kubenswrapper[4743]: E1125 16:46:30.716378 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d639f3f7-b808-4a57-9763-6624350ea374" containerName="collect-profiles"
Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.716395 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d639f3f7-b808-4a57-9763-6624350ea374" containerName="collect-profiles"
Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.716641 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d639f3f7-b808-4a57-9763-6624350ea374" containerName="collect-profiles"
Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.718401 4743 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.727999 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"] Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.783651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.783732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc6nr\" (UniqueName: \"kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.783767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.885860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc6nr\" (UniqueName: \"kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.886185 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.886500 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.886715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.886868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:30 crc kubenswrapper[4743]: I1125 16:46:30.913686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc6nr\" (UniqueName: \"kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr\") pod \"redhat-marketplace-kpvtp\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.040257 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.483373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"] Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.872778 4743 generic.go:334] "Generic (PLEG): container finished" podID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerID="05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f" exitCode=0 Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.872917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerDied","Data":"05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f"} Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.873757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerStarted","Data":"0a8d94071ecd28dc26a3302f6e71ec00eb7fbf71a95bfb29d9b926f27609a6a2"} Nov 25 16:46:31 crc kubenswrapper[4743]: I1125 16:46:31.874936 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:46:33 crc kubenswrapper[4743]: I1125 16:46:33.895042 4743 generic.go:334] "Generic (PLEG): container finished" podID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerID="f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd" exitCode=0 Nov 25 16:46:33 crc kubenswrapper[4743]: I1125 16:46:33.895126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerDied","Data":"f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd"} Nov 25 16:46:34 crc kubenswrapper[4743]: I1125 16:46:34.910091 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerStarted","Data":"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93"} Nov 25 16:46:34 crc kubenswrapper[4743]: I1125 16:46:34.938950 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kpvtp" podStartSLOduration=2.543925234 podStartE2EDuration="4.938920984s" podCreationTimestamp="2025-11-25 16:46:30 +0000 UTC" firstStartedPulling="2025-11-25 16:46:31.874696212 +0000 UTC m=+2870.996535761" lastFinishedPulling="2025-11-25 16:46:34.269691962 +0000 UTC m=+2873.391531511" observedRunningTime="2025-11-25 16:46:34.932690577 +0000 UTC m=+2874.054530166" watchObservedRunningTime="2025-11-25 16:46:34.938920984 +0000 UTC m=+2874.060760563" Nov 25 16:46:41 crc kubenswrapper[4743]: I1125 16:46:41.041276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:41 crc kubenswrapper[4743]: I1125 16:46:41.041623 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:41 crc kubenswrapper[4743]: I1125 16:46:41.082442 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:42 crc kubenswrapper[4743]: I1125 16:46:42.023795 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:42 crc kubenswrapper[4743]: I1125 16:46:42.075258 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"] Nov 25 16:46:43 crc kubenswrapper[4743]: I1125 16:46:43.992572 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kpvtp" 
podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="registry-server" containerID="cri-o://6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93" gracePeriod=2 Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.464638 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.567636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities\") pod \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.567858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content\") pod \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.567908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc6nr\" (UniqueName: \"kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr\") pod \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\" (UID: \"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72\") " Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.569018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities" (OuterVolumeSpecName: "utilities") pod "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" (UID: "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.577302 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr" (OuterVolumeSpecName: "kube-api-access-zc6nr") pod "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" (UID: "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72"). InnerVolumeSpecName "kube-api-access-zc6nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.588053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" (UID: "e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.670047 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.670078 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:46:45 crc kubenswrapper[4743]: I1125 16:46:45.670089 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc6nr\" (UniqueName: \"kubernetes.io/projected/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72-kube-api-access-zc6nr\") on node \"crc\" DevicePath \"\"" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.014648 4743 generic.go:334] "Generic (PLEG): container finished" podID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" 
containerID="6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93" exitCode=0 Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.014686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerDied","Data":"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93"} Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.014711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kpvtp" event={"ID":"e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72","Type":"ContainerDied","Data":"0a8d94071ecd28dc26a3302f6e71ec00eb7fbf71a95bfb29d9b926f27609a6a2"} Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.014711 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kpvtp" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.014726 4743 scope.go:117] "RemoveContainer" containerID="6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.041334 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"] Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.048402 4743 scope.go:117] "RemoveContainer" containerID="f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.052966 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kpvtp"] Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.070482 4743 scope.go:117] "RemoveContainer" containerID="05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.109862 4743 scope.go:117] "RemoveContainer" containerID="6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93" Nov 25 
16:46:46 crc kubenswrapper[4743]: E1125 16:46:46.110338 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93\": container with ID starting with 6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93 not found: ID does not exist" containerID="6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.110369 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93"} err="failed to get container status \"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93\": rpc error: code = NotFound desc = could not find container \"6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93\": container with ID starting with 6f8305367fc0f7a479e435299d10b801687c38944562f0da0d051c9596a14c93 not found: ID does not exist" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.110394 4743 scope.go:117] "RemoveContainer" containerID="f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd" Nov 25 16:46:46 crc kubenswrapper[4743]: E1125 16:46:46.110780 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd\": container with ID starting with f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd not found: ID does not exist" containerID="f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.110802 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd"} err="failed to get container status 
\"f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd\": rpc error: code = NotFound desc = could not find container \"f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd\": container with ID starting with f72977aec5b016eed7fbc8ce970549c6ae411bde80d8c23e274b4330f8591cdd not found: ID does not exist" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.110820 4743 scope.go:117] "RemoveContainer" containerID="05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f" Nov 25 16:46:46 crc kubenswrapper[4743]: E1125 16:46:46.111150 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f\": container with ID starting with 05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f not found: ID does not exist" containerID="05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f" Nov 25 16:46:46 crc kubenswrapper[4743]: I1125 16:46:46.111203 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f"} err="failed to get container status \"05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f\": rpc error: code = NotFound desc = could not find container \"05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f\": container with ID starting with 05ea0d6fc8257ed12f68afef57a7b2c96f15931941c4d4aa43db572386a9396f not found: ID does not exist" Nov 25 16:46:47 crc kubenswrapper[4743]: I1125 16:46:47.788731 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" path="/var/lib/kubelet/pods/e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72/volumes" Nov 25 16:46:50 crc kubenswrapper[4743]: I1125 16:46:50.078098 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:46:50 crc kubenswrapper[4743]: I1125 16:46:50.078372 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:47:20 crc kubenswrapper[4743]: I1125 16:47:20.077072 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:47:20 crc kubenswrapper[4743]: I1125 16:47:20.077678 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.077074 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.077670 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.077721 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.078280 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.078355 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" gracePeriod=600 Nov 25 16:47:50 crc kubenswrapper[4743]: E1125 16:47:50.196853 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.590946 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" exitCode=0 Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.590996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70"} Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.591272 4743 scope.go:117] "RemoveContainer" containerID="8ebb7d1bfffe1dc1dcdba8c833e72997ef35af50a18f0336dc5b31cea610b869" Nov 25 16:47:50 crc kubenswrapper[4743]: I1125 16:47:50.592060 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:47:50 crc kubenswrapper[4743]: E1125 16:47:50.592472 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:48:05 crc kubenswrapper[4743]: I1125 16:48:05.775705 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:48:05 crc kubenswrapper[4743]: E1125 16:48:05.776705 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:48:20 crc kubenswrapper[4743]: I1125 16:48:20.776025 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:48:20 crc kubenswrapper[4743]: E1125 16:48:20.777207 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:48:34 crc kubenswrapper[4743]: I1125 16:48:34.774971 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:48:34 crc kubenswrapper[4743]: E1125 16:48:34.775985 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:48:49 crc kubenswrapper[4743]: I1125 16:48:49.775097 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:48:49 crc kubenswrapper[4743]: E1125 16:48:49.775881 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:49:01 crc kubenswrapper[4743]: I1125 16:49:01.781052 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:49:01 crc kubenswrapper[4743]: E1125 
16:49:01.781983 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.690555 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:02 crc kubenswrapper[4743]: E1125 16:49:02.691551 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="extract-content" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.691574 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="extract-content" Nov 25 16:49:02 crc kubenswrapper[4743]: E1125 16:49:02.691603 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="extract-utilities" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.691610 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="extract-utilities" Nov 25 16:49:02 crc kubenswrapper[4743]: E1125 16:49:02.691632 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="registry-server" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.691640 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" containerName="registry-server" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.691826 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85ef6c2-0b1e-4e7e-bfe7-1d6b05089c72" 
containerName="registry-server" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.693130 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.728669 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.828686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5z6\" (UniqueName: \"kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.829137 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.829220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.932238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 
16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.932468 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.932570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.932715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5z6\" (UniqueName: \"kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.933058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:02 crc kubenswrapper[4743]: I1125 16:49:02.957641 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5z6\" (UniqueName: \"kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6\") pod \"redhat-operators-dwl84\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:03 crc kubenswrapper[4743]: I1125 
16:49:03.033350 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:03 crc kubenswrapper[4743]: I1125 16:49:03.472361 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:04 crc kubenswrapper[4743]: I1125 16:49:04.325362 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerID="30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7" exitCode=0 Nov 25 16:49:04 crc kubenswrapper[4743]: I1125 16:49:04.325468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerDied","Data":"30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7"} Nov 25 16:49:04 crc kubenswrapper[4743]: I1125 16:49:04.326554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerStarted","Data":"60f0f79bf82d0da390ff1c622a1e167a3a7e7923b57c13bbe77022bad7f4225c"} Nov 25 16:49:05 crc kubenswrapper[4743]: I1125 16:49:05.337123 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerStarted","Data":"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976"} Nov 25 16:49:06 crc kubenswrapper[4743]: I1125 16:49:06.359978 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerID="7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976" exitCode=0 Nov 25 16:49:06 crc kubenswrapper[4743]: I1125 16:49:06.360044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" 
event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerDied","Data":"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976"} Nov 25 16:49:09 crc kubenswrapper[4743]: I1125 16:49:09.393705 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerStarted","Data":"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c"} Nov 25 16:49:09 crc kubenswrapper[4743]: I1125 16:49:09.424009 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dwl84" podStartSLOduration=3.15826251 podStartE2EDuration="7.423988152s" podCreationTimestamp="2025-11-25 16:49:02 +0000 UTC" firstStartedPulling="2025-11-25 16:49:04.32820814 +0000 UTC m=+3023.450047689" lastFinishedPulling="2025-11-25 16:49:08.593933772 +0000 UTC m=+3027.715773331" observedRunningTime="2025-11-25 16:49:09.415486705 +0000 UTC m=+3028.537326264" watchObservedRunningTime="2025-11-25 16:49:09.423988152 +0000 UTC m=+3028.545827701" Nov 25 16:49:13 crc kubenswrapper[4743]: I1125 16:49:13.033455 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:13 crc kubenswrapper[4743]: I1125 16:49:13.034048 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:14 crc kubenswrapper[4743]: I1125 16:49:14.085653 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dwl84" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="registry-server" probeResult="failure" output=< Nov 25 16:49:14 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 25 16:49:14 crc kubenswrapper[4743]: > Nov 25 16:49:15 crc kubenswrapper[4743]: I1125 16:49:15.777232 4743 scope.go:117] "RemoveContainer" 
containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:49:15 crc kubenswrapper[4743]: E1125 16:49:15.778567 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:49:23 crc kubenswrapper[4743]: I1125 16:49:23.095314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:23 crc kubenswrapper[4743]: I1125 16:49:23.142522 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:23 crc kubenswrapper[4743]: I1125 16:49:23.334688 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:24 crc kubenswrapper[4743]: I1125 16:49:24.525040 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dwl84" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="registry-server" containerID="cri-o://59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c" gracePeriod=2 Nov 25 16:49:24 crc kubenswrapper[4743]: I1125 16:49:24.993895 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.062129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities\") pod \"ed49f8a9-a2e1-447b-976b-faac061ab48a\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.062334 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z6\" (UniqueName: \"kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6\") pod \"ed49f8a9-a2e1-447b-976b-faac061ab48a\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.062381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content\") pod \"ed49f8a9-a2e1-447b-976b-faac061ab48a\" (UID: \"ed49f8a9-a2e1-447b-976b-faac061ab48a\") " Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.063110 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities" (OuterVolumeSpecName: "utilities") pod "ed49f8a9-a2e1-447b-976b-faac061ab48a" (UID: "ed49f8a9-a2e1-447b-976b-faac061ab48a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.071224 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6" (OuterVolumeSpecName: "kube-api-access-qg5z6") pod "ed49f8a9-a2e1-447b-976b-faac061ab48a" (UID: "ed49f8a9-a2e1-447b-976b-faac061ab48a"). InnerVolumeSpecName "kube-api-access-qg5z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.153627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed49f8a9-a2e1-447b-976b-faac061ab48a" (UID: "ed49f8a9-a2e1-447b-976b-faac061ab48a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.164930 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z6\" (UniqueName: \"kubernetes.io/projected/ed49f8a9-a2e1-447b-976b-faac061ab48a-kube-api-access-qg5z6\") on node \"crc\" DevicePath \"\"" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.165135 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.165209 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed49f8a9-a2e1-447b-976b-faac061ab48a-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.537454 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerID="59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c" exitCode=0 Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.537526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerDied","Data":"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c"} Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.537570 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dwl84" event={"ID":"ed49f8a9-a2e1-447b-976b-faac061ab48a","Type":"ContainerDied","Data":"60f0f79bf82d0da390ff1c622a1e167a3a7e7923b57c13bbe77022bad7f4225c"} Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.537613 4743 scope.go:117] "RemoveContainer" containerID="59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.538226 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwl84" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.567499 4743 scope.go:117] "RemoveContainer" containerID="7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.581743 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.592047 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dwl84"] Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.610395 4743 scope.go:117] "RemoveContainer" containerID="30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.642774 4743 scope.go:117] "RemoveContainer" containerID="59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c" Nov 25 16:49:25 crc kubenswrapper[4743]: E1125 16:49:25.643219 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c\": container with ID starting with 59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c not found: ID does not exist" containerID="59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.643262 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c"} err="failed to get container status \"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c\": rpc error: code = NotFound desc = could not find container \"59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c\": container with ID starting with 59e376fe9020262552c9249e57ea30e8bd77e427cf59749d3bbbd8a4c532c69c not found: ID does not exist" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.643288 4743 scope.go:117] "RemoveContainer" containerID="7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976" Nov 25 16:49:25 crc kubenswrapper[4743]: E1125 16:49:25.643510 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976\": container with ID starting with 7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976 not found: ID does not exist" containerID="7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.643541 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976"} err="failed to get container status \"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976\": rpc error: code = NotFound desc = could not find container \"7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976\": container with ID starting with 7e0a70e3b6de8a1f93115c0e2c16a7b260725ce8683e1ffae0a5930abdcb5976 not found: ID does not exist" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.643561 4743 scope.go:117] "RemoveContainer" containerID="30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7" Nov 25 16:49:25 crc kubenswrapper[4743]: E1125 
16:49:25.643848 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7\": container with ID starting with 30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7 not found: ID does not exist" containerID="30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.643879 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7"} err="failed to get container status \"30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7\": rpc error: code = NotFound desc = could not find container \"30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7\": container with ID starting with 30de4de03f5b08f789ac8d67406be32e1b809f6c1deefc231a81ed4e073c2fd7 not found: ID does not exist" Nov 25 16:49:25 crc kubenswrapper[4743]: I1125 16:49:25.785243 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" path="/var/lib/kubelet/pods/ed49f8a9-a2e1-447b-976b-faac061ab48a/volumes" Nov 25 16:49:30 crc kubenswrapper[4743]: I1125 16:49:30.775399 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:49:30 crc kubenswrapper[4743]: E1125 16:49:30.776286 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:49:44 crc kubenswrapper[4743]: I1125 16:49:44.775743 
4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:49:44 crc kubenswrapper[4743]: E1125 16:49:44.777097 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:49:55 crc kubenswrapper[4743]: I1125 16:49:55.776095 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:49:55 crc kubenswrapper[4743]: E1125 16:49:55.777032 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:50:07 crc kubenswrapper[4743]: I1125 16:50:07.774804 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:50:07 crc kubenswrapper[4743]: E1125 16:50:07.775666 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:50:22 crc kubenswrapper[4743]: I1125 
16:50:22.775721 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:50:22 crc kubenswrapper[4743]: E1125 16:50:22.778251 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:50:33 crc kubenswrapper[4743]: I1125 16:50:33.775403 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:50:33 crc kubenswrapper[4743]: E1125 16:50:33.776651 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:50:48 crc kubenswrapper[4743]: I1125 16:50:48.775936 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:50:48 crc kubenswrapper[4743]: E1125 16:50:48.776894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:51:01 crc 
kubenswrapper[4743]: I1125 16:51:01.781117 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:51:01 crc kubenswrapper[4743]: E1125 16:51:01.781959 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:51:16 crc kubenswrapper[4743]: I1125 16:51:16.775395 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:51:16 crc kubenswrapper[4743]: E1125 16:51:16.776125 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:51:27 crc kubenswrapper[4743]: I1125 16:51:27.774705 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:51:27 crc kubenswrapper[4743]: E1125 16:51:27.775628 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 
25 16:51:41 crc kubenswrapper[4743]: I1125 16:51:41.781343 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:51:41 crc kubenswrapper[4743]: E1125 16:51:41.782007 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:51:52 crc kubenswrapper[4743]: I1125 16:51:52.775155 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:51:52 crc kubenswrapper[4743]: E1125 16:51:52.775952 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:52:06 crc kubenswrapper[4743]: I1125 16:52:06.775522 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:52:06 crc kubenswrapper[4743]: E1125 16:52:06.777381 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" 
podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:52:20 crc kubenswrapper[4743]: I1125 16:52:20.775374 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:52:20 crc kubenswrapper[4743]: E1125 16:52:20.776263 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:52:35 crc kubenswrapper[4743]: I1125 16:52:35.774807 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:52:35 crc kubenswrapper[4743]: E1125 16:52:35.775714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:52:46 crc kubenswrapper[4743]: I1125 16:52:46.775190 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:52:46 crc kubenswrapper[4743]: E1125 16:52:46.776011 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:53:00 crc kubenswrapper[4743]: I1125 16:53:00.775882 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:53:01 crc kubenswrapper[4743]: I1125 16:53:01.462729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c"} Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.128340 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:16 crc kubenswrapper[4743]: E1125 16:55:16.129450 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="extract-utilities" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.129465 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="extract-utilities" Nov 25 16:55:16 crc kubenswrapper[4743]: E1125 16:55:16.129495 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="registry-server" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.129501 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="registry-server" Nov 25 16:55:16 crc kubenswrapper[4743]: E1125 16:55:16.129526 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="extract-content" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.129533 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="extract-content" Nov 
25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.129731 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed49f8a9-a2e1-447b-976b-faac061ab48a" containerName="registry-server" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.130984 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.146297 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.318395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.318484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.318724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tq2b\" (UniqueName: \"kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.420449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.420548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.420633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tq2b\" (UniqueName: \"kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.420996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.421052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.447721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tq2b\" (UniqueName: 
\"kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b\") pod \"certified-operators-2v4gg\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:16 crc kubenswrapper[4743]: I1125 16:55:16.449545 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:17 crc kubenswrapper[4743]: I1125 16:55:16.998833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:17 crc kubenswrapper[4743]: W1125 16:55:17.004276 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb340090_5f76_4162_aa45_cb297d9067b2.slice/crio-f6048780793aef1890548e6f37af0d6a822abdfef047f7c9ac9c847d99fb5895 WatchSource:0}: Error finding container f6048780793aef1890548e6f37af0d6a822abdfef047f7c9ac9c847d99fb5895: Status 404 returned error can't find the container with id f6048780793aef1890548e6f37af0d6a822abdfef047f7c9ac9c847d99fb5895 Nov 25 16:55:17 crc kubenswrapper[4743]: I1125 16:55:17.665955 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb340090-5f76-4162-aa45-cb297d9067b2" containerID="e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255" exitCode=0 Nov 25 16:55:17 crc kubenswrapper[4743]: I1125 16:55:17.666015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerDied","Data":"e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255"} Nov 25 16:55:17 crc kubenswrapper[4743]: I1125 16:55:17.666518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" 
event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerStarted","Data":"f6048780793aef1890548e6f37af0d6a822abdfef047f7c9ac9c847d99fb5895"} Nov 25 16:55:17 crc kubenswrapper[4743]: I1125 16:55:17.668711 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:55:18 crc kubenswrapper[4743]: I1125 16:55:18.677474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerStarted","Data":"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b"} Nov 25 16:55:19 crc kubenswrapper[4743]: I1125 16:55:19.687989 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb340090-5f76-4162-aa45-cb297d9067b2" containerID="afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b" exitCode=0 Nov 25 16:55:19 crc kubenswrapper[4743]: I1125 16:55:19.688043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerDied","Data":"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b"} Nov 25 16:55:20 crc kubenswrapper[4743]: I1125 16:55:20.077427 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:55:20 crc kubenswrapper[4743]: I1125 16:55:20.077486 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:55:20 crc 
kubenswrapper[4743]: I1125 16:55:20.697927 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerStarted","Data":"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029"} Nov 25 16:55:20 crc kubenswrapper[4743]: I1125 16:55:20.737488 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2v4gg" podStartSLOduration=2.230972267 podStartE2EDuration="4.737464435s" podCreationTimestamp="2025-11-25 16:55:16 +0000 UTC" firstStartedPulling="2025-11-25 16:55:17.667993234 +0000 UTC m=+3396.789832783" lastFinishedPulling="2025-11-25 16:55:20.174485402 +0000 UTC m=+3399.296324951" observedRunningTime="2025-11-25 16:55:20.717566439 +0000 UTC m=+3399.839405998" watchObservedRunningTime="2025-11-25 16:55:20.737464435 +0000 UTC m=+3399.859303984" Nov 25 16:55:26 crc kubenswrapper[4743]: I1125 16:55:26.450419 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:26 crc kubenswrapper[4743]: I1125 16:55:26.450751 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:26 crc kubenswrapper[4743]: I1125 16:55:26.519447 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:26 crc kubenswrapper[4743]: I1125 16:55:26.801847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:26 crc kubenswrapper[4743]: I1125 16:55:26.852578 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:28 crc kubenswrapper[4743]: I1125 16:55:28.766690 4743 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-2v4gg" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="registry-server" containerID="cri-o://74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029" gracePeriod=2 Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.240809 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.359345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content\") pod \"bb340090-5f76-4162-aa45-cb297d9067b2\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.359431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities\") pod \"bb340090-5f76-4162-aa45-cb297d9067b2\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.359625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tq2b\" (UniqueName: \"kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b\") pod \"bb340090-5f76-4162-aa45-cb297d9067b2\" (UID: \"bb340090-5f76-4162-aa45-cb297d9067b2\") " Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.361079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities" (OuterVolumeSpecName: "utilities") pod "bb340090-5f76-4162-aa45-cb297d9067b2" (UID: "bb340090-5f76-4162-aa45-cb297d9067b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.365781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b" (OuterVolumeSpecName: "kube-api-access-6tq2b") pod "bb340090-5f76-4162-aa45-cb297d9067b2" (UID: "bb340090-5f76-4162-aa45-cb297d9067b2"). InnerVolumeSpecName "kube-api-access-6tq2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.401852 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb340090-5f76-4162-aa45-cb297d9067b2" (UID: "bb340090-5f76-4162-aa45-cb297d9067b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.461707 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.461747 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb340090-5f76-4162-aa45-cb297d9067b2-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.461766 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tq2b\" (UniqueName: \"kubernetes.io/projected/bb340090-5f76-4162-aa45-cb297d9067b2-kube-api-access-6tq2b\") on node \"crc\" DevicePath \"\"" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.777734 4743 generic.go:334] "Generic (PLEG): container finished" podID="bb340090-5f76-4162-aa45-cb297d9067b2" 
containerID="74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029" exitCode=0 Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.777807 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v4gg" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.784854 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerDied","Data":"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029"} Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.784895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v4gg" event={"ID":"bb340090-5f76-4162-aa45-cb297d9067b2","Type":"ContainerDied","Data":"f6048780793aef1890548e6f37af0d6a822abdfef047f7c9ac9c847d99fb5895"} Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.784913 4743 scope.go:117] "RemoveContainer" containerID="74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.823888 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.836014 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2v4gg"] Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.837260 4743 scope.go:117] "RemoveContainer" containerID="afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.859154 4743 scope.go:117] "RemoveContainer" containerID="e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.902525 4743 scope.go:117] "RemoveContainer" containerID="74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029" Nov 25 
16:55:29 crc kubenswrapper[4743]: E1125 16:55:29.903106 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029\": container with ID starting with 74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029 not found: ID does not exist" containerID="74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.903155 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029"} err="failed to get container status \"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029\": rpc error: code = NotFound desc = could not find container \"74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029\": container with ID starting with 74c37c2dd696ef8f3db3caf748287218eee4621a4ddbb6a1f9bb7b305f738029 not found: ID does not exist" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.903186 4743 scope.go:117] "RemoveContainer" containerID="afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b" Nov 25 16:55:29 crc kubenswrapper[4743]: E1125 16:55:29.903650 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b\": container with ID starting with afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b not found: ID does not exist" containerID="afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.903676 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b"} err="failed to get container status 
\"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b\": rpc error: code = NotFound desc = could not find container \"afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b\": container with ID starting with afb047db7e24bcbf40add1343f2a81bc756ca9c0fcab3fa1e5452bd08f402c0b not found: ID does not exist" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.903697 4743 scope.go:117] "RemoveContainer" containerID="e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255" Nov 25 16:55:29 crc kubenswrapper[4743]: E1125 16:55:29.903951 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255\": container with ID starting with e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255 not found: ID does not exist" containerID="e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255" Nov 25 16:55:29 crc kubenswrapper[4743]: I1125 16:55:29.903981 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255"} err="failed to get container status \"e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255\": rpc error: code = NotFound desc = could not find container \"e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255\": container with ID starting with e0050e257db18ed7964c13e485518f77ec5ad717ec7e7b9714a2b0b2eb12d255 not found: ID does not exist" Nov 25 16:55:31 crc kubenswrapper[4743]: I1125 16:55:31.786332 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" path="/var/lib/kubelet/pods/bb340090-5f76-4162-aa45-cb297d9067b2/volumes" Nov 25 16:55:50 crc kubenswrapper[4743]: I1125 16:55:50.077882 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:55:50 crc kubenswrapper[4743]: I1125 16:55:50.078731 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:56:02 crc kubenswrapper[4743]: I1125 16:56:02.073465 4743 generic.go:334] "Generic (PLEG): container finished" podID="47459f25-57d0-4c84-8f42-81c8698769bd" containerID="c9ee4843fb34d5c46c99de2780836f2af5b3bd1bd993eaa0662c86aa099420c7" exitCode=0 Nov 25 16:56:02 crc kubenswrapper[4743]: I1125 16:56:02.074089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"47459f25-57d0-4c84-8f42-81c8698769bd","Type":"ContainerDied","Data":"c9ee4843fb34d5c46c99de2780836f2af5b3bd1bd993eaa0662c86aa099420c7"} Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.415621 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.499851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.500127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwmd\" (UniqueName: \"kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.500926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501462 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.501820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"47459f25-57d0-4c84-8f42-81c8698769bd\" (UID: \"47459f25-57d0-4c84-8f42-81c8698769bd\") " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.502651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data" (OuterVolumeSpecName: "config-data") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.502898 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.502922 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.508260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd" (OuterVolumeSpecName: "kube-api-access-5dwmd") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "kube-api-access-5dwmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.508329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.510460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.534865 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.535907 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.536407 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.552768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "47459f25-57d0-4c84-8f42-81c8698769bd" (UID: "47459f25-57d0-4c84-8f42-81c8698769bd"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606767 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606809 4743 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/47459f25-57d0-4c84-8f42-81c8698769bd-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606853 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606872 4743 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606886 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwmd\" (UniqueName: \"kubernetes.io/projected/47459f25-57d0-4c84-8f42-81c8698769bd-kube-api-access-5dwmd\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606898 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47459f25-57d0-4c84-8f42-81c8698769bd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.606909 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/47459f25-57d0-4c84-8f42-81c8698769bd-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.633810 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 16:56:03 crc kubenswrapper[4743]: I1125 16:56:03.708739 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:56:04 crc kubenswrapper[4743]: I1125 16:56:04.090486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"47459f25-57d0-4c84-8f42-81c8698769bd","Type":"ContainerDied","Data":"d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b"} Nov 25 16:56:04 crc kubenswrapper[4743]: I1125 16:56:04.090535 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d16cd290df212a34c9325e5963d3f2223c6f7ecacde21e14f9baabe6cbe6872b" Nov 25 16:56:04 crc kubenswrapper[4743]: I1125 16:56:04.090573 4743 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.651449 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 16:56:10 crc kubenswrapper[4743]: E1125 16:56:10.653449 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47459f25-57d0-4c84-8f42-81c8698769bd" containerName="tempest-tests-tempest-tests-runner" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654253 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="47459f25-57d0-4c84-8f42-81c8698769bd" containerName="tempest-tests-tempest-tests-runner" Nov 25 16:56:10 crc kubenswrapper[4743]: E1125 16:56:10.654337 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="extract-utilities" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654345 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="extract-utilities" Nov 25 16:56:10 crc kubenswrapper[4743]: E1125 16:56:10.654391 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="registry-server" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654398 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="registry-server" Nov 25 16:56:10 crc kubenswrapper[4743]: E1125 16:56:10.654408 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="extract-content" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654414 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="extract-content" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654816 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bb340090-5f76-4162-aa45-cb297d9067b2" containerName="registry-server" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.654844 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="47459f25-57d0-4c84-8f42-81c8698769bd" containerName="tempest-tests-tempest-tests-runner" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.655829 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.658460 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hl8vr" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.662530 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.736569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ck5b\" (UniqueName: \"kubernetes.io/projected/4e216387-c508-4f98-adff-3b4a3e97003e-kube-api-access-7ck5b\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.736962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.838184 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.838342 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ck5b\" (UniqueName: \"kubernetes.io/projected/4e216387-c508-4f98-adff-3b4a3e97003e-kube-api-access-7ck5b\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.839199 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.855513 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ck5b\" (UniqueName: \"kubernetes.io/projected/4e216387-c508-4f98-adff-3b4a3e97003e-kube-api-access-7ck5b\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 crc kubenswrapper[4743]: I1125 16:56:10.865857 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4e216387-c508-4f98-adff-3b4a3e97003e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:10 
crc kubenswrapper[4743]: I1125 16:56:10.977029 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 16:56:11 crc kubenswrapper[4743]: I1125 16:56:11.410775 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 16:56:12 crc kubenswrapper[4743]: I1125 16:56:12.161900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4e216387-c508-4f98-adff-3b4a3e97003e","Type":"ContainerStarted","Data":"f09f7080992baf94e475053c9ca185a479d65c341914c4d96f0f899dc02db658"} Nov 25 16:56:13 crc kubenswrapper[4743]: I1125 16:56:13.184360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4e216387-c508-4f98-adff-3b4a3e97003e","Type":"ContainerStarted","Data":"496b31c2243340d6231f9a1ac7618453536e1d4d211c415ff2a38631bbed9fec"} Nov 25 16:56:13 crc kubenswrapper[4743]: I1125 16:56:13.212281 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.323534989 podStartE2EDuration="3.21226279s" podCreationTimestamp="2025-11-25 16:56:10 +0000 UTC" firstStartedPulling="2025-11-25 16:56:11.416382837 +0000 UTC m=+3450.538222386" lastFinishedPulling="2025-11-25 16:56:12.305110618 +0000 UTC m=+3451.426950187" observedRunningTime="2025-11-25 16:56:13.200350135 +0000 UTC m=+3452.322189684" watchObservedRunningTime="2025-11-25 16:56:13.21226279 +0000 UTC m=+3452.334102339" Nov 25 16:56:20 crc kubenswrapper[4743]: I1125 16:56:20.077189 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:56:20 crc kubenswrapper[4743]: I1125 16:56:20.078773 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:56:20 crc kubenswrapper[4743]: I1125 16:56:20.079145 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:56:20 crc kubenswrapper[4743]: I1125 16:56:20.251061 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:56:20 crc kubenswrapper[4743]: I1125 16:56:20.251165 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c" gracePeriod=600 Nov 25 16:56:21 crc kubenswrapper[4743]: I1125 16:56:21.261513 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c" exitCode=0 Nov 25 16:56:21 crc kubenswrapper[4743]: I1125 16:56:21.261634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" 
event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c"} Nov 25 16:56:21 crc kubenswrapper[4743]: I1125 16:56:21.261937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76"} Nov 25 16:56:21 crc kubenswrapper[4743]: I1125 16:56:21.261964 4743 scope.go:117] "RemoveContainer" containerID="03fa1d34d0ccb0db4792627177394d688c5d1105daef9149c40e466660d5df70" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.528239 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9x9kh/must-gather-t9zq6"] Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.537312 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.539282 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9x9kh"/"kube-root-ca.crt" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.539720 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9x9kh"/"openshift-service-ca.crt" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.561778 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9x9kh/must-gather-t9zq6"] Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.577313 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjls\" (UniqueName: \"kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 
16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.577441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.679252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.679411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjls\" (UniqueName: \"kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.679861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc kubenswrapper[4743]: I1125 16:56:39.697248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjls\" (UniqueName: \"kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls\") pod \"must-gather-t9zq6\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:39 crc 
kubenswrapper[4743]: I1125 16:56:39.866329 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 16:56:40 crc kubenswrapper[4743]: I1125 16:56:40.324370 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9x9kh/must-gather-t9zq6"] Nov 25 16:56:40 crc kubenswrapper[4743]: W1125 16:56:40.325777 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb7ee6d_6062_455a_b0cd_39aa268fc029.slice/crio-b2902c8fd233f42e36b89371d1d9026b6596bfb1321e28e0df9d7c9494008fbe WatchSource:0}: Error finding container b2902c8fd233f42e36b89371d1d9026b6596bfb1321e28e0df9d7c9494008fbe: Status 404 returned error can't find the container with id b2902c8fd233f42e36b89371d1d9026b6596bfb1321e28e0df9d7c9494008fbe Nov 25 16:56:40 crc kubenswrapper[4743]: I1125 16:56:40.458843 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" event={"ID":"2bb7ee6d-6062-455a-b0cd-39aa268fc029","Type":"ContainerStarted","Data":"b2902c8fd233f42e36b89371d1d9026b6596bfb1321e28e0df9d7c9494008fbe"} Nov 25 16:56:44 crc kubenswrapper[4743]: I1125 16:56:44.498908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" event={"ID":"2bb7ee6d-6062-455a-b0cd-39aa268fc029","Type":"ContainerStarted","Data":"8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a"} Nov 25 16:56:44 crc kubenswrapper[4743]: I1125 16:56:44.499525 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" event={"ID":"2bb7ee6d-6062-455a-b0cd-39aa268fc029","Type":"ContainerStarted","Data":"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83"} Nov 25 16:56:44 crc kubenswrapper[4743]: I1125 16:56:44.512966 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-9x9kh/must-gather-t9zq6" podStartSLOduration=1.9462630600000002 podStartE2EDuration="5.512950659s" podCreationTimestamp="2025-11-25 16:56:39 +0000 UTC" firstStartedPulling="2025-11-25 16:56:40.328121798 +0000 UTC m=+3479.449961367" lastFinishedPulling="2025-11-25 16:56:43.894809417 +0000 UTC m=+3483.016648966" observedRunningTime="2025-11-25 16:56:44.512180274 +0000 UTC m=+3483.634019823" watchObservedRunningTime="2025-11-25 16:56:44.512950659 +0000 UTC m=+3483.634790208" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.713023 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-5ts97"] Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.714866 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.717199 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9x9kh"/"default-dockercfg-jc78b" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.732308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt92k\" (UniqueName: \"kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.732795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.834339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zt92k\" (UniqueName: \"kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.836037 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.836147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:47 crc kubenswrapper[4743]: I1125 16:56:47.858197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt92k\" (UniqueName: \"kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k\") pod \"crc-debug-5ts97\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:48 crc kubenswrapper[4743]: I1125 16:56:48.034840 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:56:48 crc kubenswrapper[4743]: W1125 16:56:48.074249 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod659dfa8d_9f99_4cd8_a370_7c7ca9f4a977.slice/crio-50983bb4cf15445ea78cc9f49e8f211059ed139b1a024843cea8a2b1ecabd74e WatchSource:0}: Error finding container 50983bb4cf15445ea78cc9f49e8f211059ed139b1a024843cea8a2b1ecabd74e: Status 404 returned error can't find the container with id 50983bb4cf15445ea78cc9f49e8f211059ed139b1a024843cea8a2b1ecabd74e Nov 25 16:56:48 crc kubenswrapper[4743]: I1125 16:56:48.536752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" event={"ID":"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977","Type":"ContainerStarted","Data":"50983bb4cf15445ea78cc9f49e8f211059ed139b1a024843cea8a2b1ecabd74e"} Nov 25 16:56:59 crc kubenswrapper[4743]: I1125 16:56:59.637425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" event={"ID":"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977","Type":"ContainerStarted","Data":"e20a738fe814d37ef202a5e404d3bf6da9bf380e3887647ab9440a2eb86aa677"} Nov 25 16:56:59 crc kubenswrapper[4743]: I1125 16:56:59.654101 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" podStartSLOduration=1.381451809 podStartE2EDuration="12.654076882s" podCreationTimestamp="2025-11-25 16:56:47 +0000 UTC" firstStartedPulling="2025-11-25 16:56:48.076763667 +0000 UTC m=+3487.198603216" lastFinishedPulling="2025-11-25 16:56:59.34938874 +0000 UTC m=+3498.471228289" observedRunningTime="2025-11-25 16:56:59.650029716 +0000 UTC m=+3498.771869265" watchObservedRunningTime="2025-11-25 16:56:59.654076882 +0000 UTC m=+3498.775916431" Nov 25 16:57:41 crc kubenswrapper[4743]: I1125 16:57:41.024564 4743 generic.go:334] "Generic (PLEG): container 
finished" podID="659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" containerID="e20a738fe814d37ef202a5e404d3bf6da9bf380e3887647ab9440a2eb86aa677" exitCode=0 Nov 25 16:57:41 crc kubenswrapper[4743]: I1125 16:57:41.024678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" event={"ID":"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977","Type":"ContainerDied","Data":"e20a738fe814d37ef202a5e404d3bf6da9bf380e3887647ab9440a2eb86aa677"} Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.132278 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.173516 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-5ts97"] Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.181844 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-5ts97"] Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.267513 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host\") pod \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.267576 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt92k\" (UniqueName: \"kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k\") pod \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\" (UID: \"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977\") " Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.267660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host" (OuterVolumeSpecName: "host") pod "659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" (UID: 
"659dfa8d-9f99-4cd8-a370-7c7ca9f4a977"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.268057 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-host\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.273445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k" (OuterVolumeSpecName: "kube-api-access-zt92k") pod "659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" (UID: "659dfa8d-9f99-4cd8-a370-7c7ca9f4a977"). InnerVolumeSpecName "kube-api-access-zt92k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:57:42 crc kubenswrapper[4743]: I1125 16:57:42.369786 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt92k\" (UniqueName: \"kubernetes.io/projected/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977-kube-api-access-zt92k\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.043710 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50983bb4cf15445ea78cc9f49e8f211059ed139b1a024843cea8a2b1ecabd74e" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.043762 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-5ts97" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.309813 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-qchh2"] Nov 25 16:57:43 crc kubenswrapper[4743]: E1125 16:57:43.310193 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" containerName="container-00" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.310204 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" containerName="container-00" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.310374 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" containerName="container-00" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.310977 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.313499 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9x9kh"/"default-dockercfg-jc78b" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.386703 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvfw\" (UniqueName: \"kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.386889 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " 
pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.488179 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvfw\" (UniqueName: \"kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.488350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.488510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.507973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvfw\" (UniqueName: \"kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw\") pod \"crc-debug-qchh2\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.630505 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:43 crc kubenswrapper[4743]: I1125 16:57:43.793613 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659dfa8d-9f99-4cd8-a370-7c7ca9f4a977" path="/var/lib/kubelet/pods/659dfa8d-9f99-4cd8-a370-7c7ca9f4a977/volumes" Nov 25 16:57:44 crc kubenswrapper[4743]: I1125 16:57:44.052850 4743 generic.go:334] "Generic (PLEG): container finished" podID="4feaeb77-fda9-421f-9957-14afc0a299e9" containerID="5521e9d162679f2b086d11f6ea0caecfec251c003ed37938fb5da468b91ec5be" exitCode=0 Nov 25 16:57:44 crc kubenswrapper[4743]: I1125 16:57:44.052895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" event={"ID":"4feaeb77-fda9-421f-9957-14afc0a299e9","Type":"ContainerDied","Data":"5521e9d162679f2b086d11f6ea0caecfec251c003ed37938fb5da468b91ec5be"} Nov 25 16:57:44 crc kubenswrapper[4743]: I1125 16:57:44.052926 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" event={"ID":"4feaeb77-fda9-421f-9957-14afc0a299e9","Type":"ContainerStarted","Data":"631c41fb4344d071ac0a35971aea52500c78658dcc555a45c7dfddb6484ed95a"} Nov 25 16:57:44 crc kubenswrapper[4743]: I1125 16:57:44.561461 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-qchh2"] Nov 25 16:57:44 crc kubenswrapper[4743]: I1125 16:57:44.569756 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-qchh2"] Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.222960 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.326269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host\") pod \"4feaeb77-fda9-421f-9957-14afc0a299e9\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.326360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host" (OuterVolumeSpecName: "host") pod "4feaeb77-fda9-421f-9957-14afc0a299e9" (UID: "4feaeb77-fda9-421f-9957-14afc0a299e9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.326424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kvfw\" (UniqueName: \"kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw\") pod \"4feaeb77-fda9-421f-9957-14afc0a299e9\" (UID: \"4feaeb77-fda9-421f-9957-14afc0a299e9\") " Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.326841 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4feaeb77-fda9-421f-9957-14afc0a299e9-host\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.331626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw" (OuterVolumeSpecName: "kube-api-access-6kvfw") pod "4feaeb77-fda9-421f-9957-14afc0a299e9" (UID: "4feaeb77-fda9-421f-9957-14afc0a299e9"). InnerVolumeSpecName "kube-api-access-6kvfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.429104 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kvfw\" (UniqueName: \"kubernetes.io/projected/4feaeb77-fda9-421f-9957-14afc0a299e9-kube-api-access-6kvfw\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.711437 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-lgbnn"] Nov 25 16:57:45 crc kubenswrapper[4743]: E1125 16:57:45.712005 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4feaeb77-fda9-421f-9957-14afc0a299e9" containerName="container-00" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.712026 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feaeb77-fda9-421f-9957-14afc0a299e9" containerName="container-00" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.712330 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4feaeb77-fda9-421f-9957-14afc0a299e9" containerName="container-00" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.713233 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.788542 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4feaeb77-fda9-421f-9957-14afc0a299e9" path="/var/lib/kubelet/pods/4feaeb77-fda9-421f-9957-14afc0a299e9/volumes" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.835938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97tw\" (UniqueName: \"kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.835988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.938450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97tw\" (UniqueName: \"kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.938509 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.938656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:45 crc kubenswrapper[4743]: I1125 16:57:45.960169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97tw\" (UniqueName: \"kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw\") pod \"crc-debug-lgbnn\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:46 crc kubenswrapper[4743]: I1125 16:57:46.028361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:46 crc kubenswrapper[4743]: W1125 16:57:46.053677 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9970e66f_3212_4c54_b37f_fbf225acb1cc.slice/crio-5c4014bf470baf405630d5d510f0d7efa1ea7a9a15d39527be0db863cd8cc381 WatchSource:0}: Error finding container 5c4014bf470baf405630d5d510f0d7efa1ea7a9a15d39527be0db863cd8cc381: Status 404 returned error can't find the container with id 5c4014bf470baf405630d5d510f0d7efa1ea7a9a15d39527be0db863cd8cc381 Nov 25 16:57:46 crc kubenswrapper[4743]: I1125 16:57:46.139581 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" event={"ID":"9970e66f-3212-4c54-b37f-fbf225acb1cc","Type":"ContainerStarted","Data":"5c4014bf470baf405630d5d510f0d7efa1ea7a9a15d39527be0db863cd8cc381"} Nov 25 16:57:46 crc kubenswrapper[4743]: I1125 16:57:46.142422 4743 scope.go:117] "RemoveContainer" containerID="5521e9d162679f2b086d11f6ea0caecfec251c003ed37938fb5da468b91ec5be" Nov 25 16:57:46 crc kubenswrapper[4743]: I1125 16:57:46.142497 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-qchh2" Nov 25 16:57:47 crc kubenswrapper[4743]: I1125 16:57:47.155993 4743 generic.go:334] "Generic (PLEG): container finished" podID="9970e66f-3212-4c54-b37f-fbf225acb1cc" containerID="97861768522b2d8f0f8bd40877e9905e1ce783017e91ba17b899ab25c61f2bc5" exitCode=0 Nov 25 16:57:47 crc kubenswrapper[4743]: I1125 16:57:47.157908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" event={"ID":"9970e66f-3212-4c54-b37f-fbf225acb1cc","Type":"ContainerDied","Data":"97861768522b2d8f0f8bd40877e9905e1ce783017e91ba17b899ab25c61f2bc5"} Nov 25 16:57:47 crc kubenswrapper[4743]: I1125 16:57:47.188600 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-lgbnn"] Nov 25 16:57:47 crc kubenswrapper[4743]: I1125 16:57:47.197211 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9x9kh/crc-debug-lgbnn"] Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.282130 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.481078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97tw\" (UniqueName: \"kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw\") pod \"9970e66f-3212-4c54-b37f-fbf225acb1cc\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.481198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host\") pod \"9970e66f-3212-4c54-b37f-fbf225acb1cc\" (UID: \"9970e66f-3212-4c54-b37f-fbf225acb1cc\") " Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.481352 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host" (OuterVolumeSpecName: "host") pod "9970e66f-3212-4c54-b37f-fbf225acb1cc" (UID: "9970e66f-3212-4c54-b37f-fbf225acb1cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.482135 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9970e66f-3212-4c54-b37f-fbf225acb1cc-host\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.486483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw" (OuterVolumeSpecName: "kube-api-access-t97tw") pod "9970e66f-3212-4c54-b37f-fbf225acb1cc" (UID: "9970e66f-3212-4c54-b37f-fbf225acb1cc"). InnerVolumeSpecName "kube-api-access-t97tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:57:48 crc kubenswrapper[4743]: I1125 16:57:48.584667 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t97tw\" (UniqueName: \"kubernetes.io/projected/9970e66f-3212-4c54-b37f-fbf225acb1cc-kube-api-access-t97tw\") on node \"crc\" DevicePath \"\"" Nov 25 16:57:49 crc kubenswrapper[4743]: I1125 16:57:49.176535 4743 scope.go:117] "RemoveContainer" containerID="97861768522b2d8f0f8bd40877e9905e1ce783017e91ba17b899ab25c61f2bc5" Nov 25 16:57:49 crc kubenswrapper[4743]: I1125 16:57:49.176566 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/crc-debug-lgbnn" Nov 25 16:57:49 crc kubenswrapper[4743]: I1125 16:57:49.785168 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9970e66f-3212-4c54-b37f-fbf225acb1cc" path="/var/lib/kubelet/pods/9970e66f-3212-4c54-b37f-fbf225acb1cc/volumes" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.716105 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"] Nov 25 16:57:54 crc kubenswrapper[4743]: E1125 16:57:54.717059 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9970e66f-3212-4c54-b37f-fbf225acb1cc" containerName="container-00" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.717073 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9970e66f-3212-4c54-b37f-fbf225acb1cc" containerName="container-00" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.717250 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9970e66f-3212-4c54-b37f-fbf225acb1cc" containerName="container-00" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.718493 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.729703 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"] Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.900299 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.900363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:54 crc kubenswrapper[4743]: I1125 16:57:54.900537 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.002153 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.002198 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.002236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.002675 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.003142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.023509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz\") pod \"redhat-marketplace-fslwf\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.050794 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:57:55 crc kubenswrapper[4743]: I1125 16:57:55.529556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"] Nov 25 16:57:56 crc kubenswrapper[4743]: I1125 16:57:56.241225 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerID="229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a" exitCode=0 Nov 25 16:57:56 crc kubenswrapper[4743]: I1125 16:57:56.241333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerDied","Data":"229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a"} Nov 25 16:57:56 crc kubenswrapper[4743]: I1125 16:57:56.241567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerStarted","Data":"acf4c95e96f23747fcf9ad6e1a5b5767b0a5a0bac9a354a1412171ab965f7479"} Nov 25 16:57:57 crc kubenswrapper[4743]: I1125 16:57:57.253125 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerID="18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c" exitCode=0 Nov 25 16:57:57 crc kubenswrapper[4743]: I1125 16:57:57.253184 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerDied","Data":"18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c"} Nov 25 16:57:58 crc kubenswrapper[4743]: I1125 16:57:58.263576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" 
event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerStarted","Data":"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268"} Nov 25 16:57:58 crc kubenswrapper[4743]: I1125 16:57:58.290235 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fslwf" podStartSLOduration=2.582904269 podStartE2EDuration="4.290209926s" podCreationTimestamp="2025-11-25 16:57:54 +0000 UTC" firstStartedPulling="2025-11-25 16:57:56.243194555 +0000 UTC m=+3555.365034104" lastFinishedPulling="2025-11-25 16:57:57.950500212 +0000 UTC m=+3557.072339761" observedRunningTime="2025-11-25 16:57:58.279083596 +0000 UTC m=+3557.400923155" watchObservedRunningTime="2025-11-25 16:57:58.290209926 +0000 UTC m=+3557.412049485" Nov 25 16:58:01 crc kubenswrapper[4743]: I1125 16:58:01.505277 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85dc5d687d-qkdzh_f3984c1f-c5d2-4a6a-9058-4c272455dcd8/barbican-api/0.log" Nov 25 16:58:01 crc kubenswrapper[4743]: I1125 16:58:01.622949 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85dc5d687d-qkdzh_f3984c1f-c5d2-4a6a-9058-4c272455dcd8/barbican-api-log/0.log" Nov 25 16:58:01 crc kubenswrapper[4743]: I1125 16:58:01.726959 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74944f4b54-xg775_095f59f0-0093-4e6d-8aa3-0ddc0161b213/barbican-keystone-listener/0.log" Nov 25 16:58:01 crc kubenswrapper[4743]: I1125 16:58:01.740985 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74944f4b54-xg775_095f59f0-0093-4e6d-8aa3-0ddc0161b213/barbican-keystone-listener-log/0.log" Nov 25 16:58:01 crc kubenswrapper[4743]: I1125 16:58:01.855905 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c9cfd9b5-p7l4w_ef127ba1-444d-4f1c-937b-965c7ce47d1a/barbican-worker/0.log" Nov 25 16:58:01 crc 
kubenswrapper[4743]: I1125 16:58:01.942870 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c9cfd9b5-p7l4w_ef127ba1-444d-4f1c-937b-965c7ce47d1a/barbican-worker-log/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.055808 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd_14bc3c31-f23e-4c67-a989-e85613bd5607/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.160939 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/ceilometer-notification-agent/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.167423 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/ceilometer-central-agent/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.240502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/proxy-httpd/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.268247 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/sg-core/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.586648 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f8e01616-0594-420e-9180-2c348780903a/cinder-api-log/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.669553 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f8e01616-0594-420e-9180-2c348780903a/cinder-api/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.705120 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fd119f0-4e29-4050-baee-a0261c883787/cinder-scheduler/0.log" Nov 25 16:58:02 crc 
kubenswrapper[4743]: I1125 16:58:02.715910 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fd119f0-4e29-4050-baee-a0261c883787/probe/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.903439 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pzm62_cf95749b-9f3b-4df2-afaf-869ec45e1807/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:02 crc kubenswrapper[4743]: I1125 16:58:02.936548 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt_78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.087825 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/init/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.318952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/init/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.344090 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt_370e248d-8977-4a95-ac29-df64918b694b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.378500 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/dnsmasq-dns/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.524338 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_791e2d3a-4b72-42dc-9df0-0a185817f347/glance-httpd/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.562364 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_791e2d3a-4b72-42dc-9df0-0a185817f347/glance-log/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.734785 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_07d575a4-6889-4bf6-ad82-4c7e756607d2/glance-log/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.739947 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_07d575a4-6889-4bf6-ad82-4c7e756607d2/glance-httpd/0.log" Nov 25 16:58:03 crc kubenswrapper[4743]: I1125 16:58:03.882454 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7495cddcb-ghpkx_1e54ceb1-969a-4172-9928-7e424dd38b5b/horizon/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.074303 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kf689_522738de-cb3a-424d-ae01-b73bd3bcd8c6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.238327 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7495cddcb-ghpkx_1e54ceb1-969a-4172-9928-7e424dd38b5b/horizon-log/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.309393 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qtx8v_b1c2dd10-3126-4c40-a55f-679ed3441056/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.535779 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_aa4a8c5c-3c11-45e5-815e-bebe62e1b165/kube-state-metrics/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.571822 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-848747fd7b-bljn8_0e5a8995-2691-4c7f-baee-bf9cdf1b2427/keystone-api/0.log" Nov 25 16:58:04 crc kubenswrapper[4743]: I1125 16:58:04.781337 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6_7568caf6-7fa3-429a-90f2-40cbd4dece9d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.050930 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.050970 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.131120 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.180875 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cc5fc48dc-hkvc8_c8823220-9bb8-44a4-a4a6-00661d8e2fad/neutron-httpd/0.log" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.231720 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cc5fc48dc-hkvc8_c8823220-9bb8-44a4-a4a6-00661d8e2fad/neutron-api/0.log" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.370653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.421939 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"] Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.473524 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv_389b43ba-821f-48b6-b924-46ddda4e2d11/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.955303 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a5aaab81-18e9-41e2-8db4-00c4a09b7710/nova-api-log/0.log" Nov 25 16:58:05 crc kubenswrapper[4743]: I1125 16:58:05.973470 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_da59725d-9914-40d1-b70b-57df96de1db2/nova-cell0-conductor-conductor/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.140514 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a5aaab81-18e9-41e2-8db4-00c4a09b7710/nova-api-api/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.241897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d229e467-a473-44bf-9f13-73155f796874/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.244725 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_66fd62a5-dbf6-4ff3-a910-1969f287da86/nova-cell1-conductor-conductor/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.429953 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wfdh2_a80ee7c3-2c23-4079-994f-b04e8a21516e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.539916 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5b78218b-03ac-4dbb-89cf-58580f5367d3/nova-metadata-log/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.855572 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/mysql-bootstrap/0.log" Nov 25 16:58:06 crc kubenswrapper[4743]: I1125 16:58:06.900125 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5f82a83d-3847-490f-b9dd-5dda26140b80/nova-scheduler-scheduler/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.080236 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/galera/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.136883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/mysql-bootstrap/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.327334 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/mysql-bootstrap/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.343716 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fslwf" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="registry-server" containerID="cri-o://11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268" gracePeriod=2 Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.566824 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/mysql-bootstrap/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.577433 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/galera/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.786649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8dtsl_7750901a-7566-4d94-8cb5-5aff66e22116/ovn-controller/0.log" 
Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.854126 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fslwf" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.880801 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b65064de-e088-4c89-9767-db14019b6e44/openstackclient/0.log" Nov 25 16:58:07 crc kubenswrapper[4743]: I1125 16:58:07.954657 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5b78218b-03ac-4dbb-89cf-58580f5367d3/nova-metadata-metadata/0.log" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.032542 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content\") pod \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.032724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz\") pod \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.032954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities\") pod \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\" (UID: \"2ed19d00-a15c-44b0-a0ed-e50321521ad6\") " Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.034308 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities" (OuterVolumeSpecName: "utilities") pod "2ed19d00-a15c-44b0-a0ed-e50321521ad6" (UID: 
"2ed19d00-a15c-44b0-a0ed-e50321521ad6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.056890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz" (OuterVolumeSpecName: "kube-api-access-72hqz") pod "2ed19d00-a15c-44b0-a0ed-e50321521ad6" (UID: "2ed19d00-a15c-44b0-a0ed-e50321521ad6"). InnerVolumeSpecName "kube-api-access-72hqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.069712 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ed19d00-a15c-44b0-a0ed-e50321521ad6" (UID: "2ed19d00-a15c-44b0-a0ed-e50321521ad6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.134677 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.134706 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ed19d00-a15c-44b0-a0ed-e50321521ad6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.134717 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/2ed19d00-a15c-44b0-a0ed-e50321521ad6-kube-api-access-72hqz\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.153828 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-znflv_5835f976-c6b4-4bd9-9893-70905ce30872/openstack-network-exporter/0.log" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.310547 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server-init/0.log" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.353876 4743 generic.go:334] "Generic (PLEG): container finished" podID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerID="11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268" exitCode=0 Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.353917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerDied","Data":"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268"} Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.353941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fslwf" event={"ID":"2ed19d00-a15c-44b0-a0ed-e50321521ad6","Type":"ContainerDied","Data":"acf4c95e96f23747fcf9ad6e1a5b5767b0a5a0bac9a354a1412171ab965f7479"} Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.353956 4743 scope.go:117] "RemoveContainer" containerID="11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268" Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.354317 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fslwf"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.374734 4743 scope.go:117] "RemoveContainer" containerID="18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.399143 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"]
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.403816 4743 scope.go:117] "RemoveContainer" containerID="229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.412048 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fslwf"]
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.453464 4743 scope.go:117] "RemoveContainer" containerID="11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268"
Nov 25 16:58:08 crc kubenswrapper[4743]: E1125 16:58:08.454034 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268\": container with ID starting with 11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268 not found: ID does not exist" containerID="11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.454074 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268"} err="failed to get container status \"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268\": rpc error: code = NotFound desc = could not find container \"11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268\": container with ID starting with 11a8e84ebb375b99421668307570aedba8e90e7b3ba4452cdd264ad7599e8268 not found: ID does not exist"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.454104 4743 scope.go:117] "RemoveContainer" containerID="18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c"
Nov 25 16:58:08 crc kubenswrapper[4743]: E1125 16:58:08.454384 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c\": container with ID starting with 18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c not found: ID does not exist" containerID="18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.454432 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c"} err="failed to get container status \"18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c\": rpc error: code = NotFound desc = could not find container \"18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c\": container with ID starting with 18b6e7fb1108ef12e20d078cede419631f7ab78bba52102886d4b1378497d21c not found: ID does not exist"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.454467 4743 scope.go:117] "RemoveContainer" containerID="229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a"
Nov 25 16:58:08 crc kubenswrapper[4743]: E1125 16:58:08.454818 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a\": container with ID starting with 229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a not found: ID does not exist" containerID="229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.454850 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a"} err="failed to get container status \"229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a\": rpc error: code = NotFound desc = could not find container \"229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a\": container with ID starting with 229431a2c61676f3748dd6cf7d1745a4ebff6b0be40b94df62f0f39a0201e88a not found: ID does not exist"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.605817 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server-init/0.log"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.674095 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovs-vswitchd/0.log"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.688269 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server/0.log"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.852811 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-75jlm_2073fba4-3e3f-4c49-ae69-265ffbc47f68/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.877090 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eb743ab5-16ea-4be4-95ee-00a87767602e/openstack-network-exporter/0.log"
Nov 25 16:58:08 crc kubenswrapper[4743]: I1125 16:58:08.971084 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eb743ab5-16ea-4be4-95ee-00a87767602e/ovn-northd/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.049623 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1/ovsdbserver-nb/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.148812 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1/openstack-network-exporter/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.249045 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1c6f500e-afe2-4505-8a75-d68f109b80dc/openstack-network-exporter/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.401974 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1c6f500e-afe2-4505-8a75-d68f109b80dc/ovsdbserver-sb/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.499065 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6dd8557654-lgr92_659f8a21-e29e-47be-903b-742de8ec9b22/placement-api/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.585820 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6dd8557654-lgr92_659f8a21-e29e-47be-903b-742de8ec9b22/placement-log/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.700284 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/setup-container/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.784864 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" path="/var/lib/kubelet/pods/2ed19d00-a15c-44b0-a0ed-e50321521ad6/volumes"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.916323 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/rabbitmq/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.966118 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/setup-container/0.log"
Nov 25 16:58:09 crc kubenswrapper[4743]: I1125 16:58:09.971505 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/setup-container/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.188134 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/setup-container/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.191092 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt_0d82431d-8bd6-4d1a-850d-d8c543994421/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.196967 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/rabbitmq/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.356333 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdkdh_36ce5802-7073-425c-bd4e-1b770cfacd49/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.451795 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6_4ae17f2e-689f-4dd3-bc91-c52a218a8492/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.612633 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gqf46_a2aeec84-22a9-4f07-a1e2-12e61f62f09c/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.718175 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vd9jg_3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5/ssh-known-hosts-edpm-deployment/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.942711 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c64568bc5-svsgq_fd562da8-2d36-4517-8d73-237580575e98/proxy-httpd/0.log"
Nov 25 16:58:10 crc kubenswrapper[4743]: I1125 16:58:10.968219 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c64568bc5-svsgq_fd562da8-2d36-4517-8d73-237580575e98/proxy-server/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.057078 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m926g_f5eff179-2afc-4ec2-addc-31c3c36a6fd7/swift-ring-rebalance/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.140650 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-reaper/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.202724 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-auditor/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.277259 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-replicator/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.361782 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-server/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.440522 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-replicator/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.465772 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-auditor/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.512383 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-server/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.740538 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-updater/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.789165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-auditor/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.818198 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-expirer/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.854349 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-replicator/0.log"
Nov 25 16:58:11 crc kubenswrapper[4743]: I1125 16:58:11.929342 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-server/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.027491 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/swift-recon-cron/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.066336 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/rsync/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.078835 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-updater/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.255866 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql_4dd9e80f-8e99-46a6-b669-b2ec10285463/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.307000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_47459f25-57d0-4c84-8f42-81c8698769bd/tempest-tests-tempest-tests-runner/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.473401 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4e216387-c508-4f98-adff-3b4a3e97003e/test-operator-logs-container/0.log"
Nov 25 16:58:12 crc kubenswrapper[4743]: I1125 16:58:12.593755 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr_865996cb-146d-428e-aff6-7ce31c808ffe/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 25 16:58:20 crc kubenswrapper[4743]: I1125 16:58:20.076823 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 16:58:20 crc kubenswrapper[4743]: I1125 16:58:20.077378 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 16:58:22 crc kubenswrapper[4743]: I1125 16:58:22.783466 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_30545138-1305-45e8-9225-386065312213/memcached/0.log"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.522327 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fs9jd"]
Nov 25 16:58:27 crc kubenswrapper[4743]: E1125 16:58:27.523494 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="extract-content"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.523514 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="extract-content"
Nov 25 16:58:27 crc kubenswrapper[4743]: E1125 16:58:27.523540 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="extract-utilities"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.523548 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="extract-utilities"
Nov 25 16:58:27 crc kubenswrapper[4743]: E1125 16:58:27.523566 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="registry-server"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.523575 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="registry-server"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.523876 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed19d00-a15c-44b0-a0ed-e50321521ad6" containerName="registry-server"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.525582 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.540404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fs9jd"]
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.645580 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8btw\" (UniqueName: \"kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.645833 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.645898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.748351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8btw\" (UniqueName: \"kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.748453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.748481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.749094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.749105 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.769622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8btw\" (UniqueName: \"kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw\") pod \"community-operators-fs9jd\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") " pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:27 crc kubenswrapper[4743]: I1125 16:58:27.854681 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:28 crc kubenswrapper[4743]: I1125 16:58:28.343583 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fs9jd"]
Nov 25 16:58:28 crc kubenswrapper[4743]: I1125 16:58:28.546840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerStarted","Data":"b5658eb46997a1d438c4e828271f3dff35abe0c44398b627516d40c94bda300b"}
Nov 25 16:58:29 crc kubenswrapper[4743]: I1125 16:58:29.557530 4743 generic.go:334] "Generic (PLEG): container finished" podID="6eb8ab64-a651-4e94-9489-1686fe286843" containerID="8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e" exitCode=0
Nov 25 16:58:29 crc kubenswrapper[4743]: I1125 16:58:29.557641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerDied","Data":"8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e"}
Nov 25 16:58:31 crc kubenswrapper[4743]: I1125 16:58:31.576477 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerStarted","Data":"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7"}
Nov 25 16:58:32 crc kubenswrapper[4743]: I1125 16:58:32.587007 4743 generic.go:334] "Generic (PLEG): container finished" podID="6eb8ab64-a651-4e94-9489-1686fe286843" containerID="2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7" exitCode=0
Nov 25 16:58:32 crc kubenswrapper[4743]: I1125 16:58:32.587095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerDied","Data":"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7"}
Nov 25 16:58:33 crc kubenswrapper[4743]: I1125 16:58:33.602187 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerStarted","Data":"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46"}
Nov 25 16:58:33 crc kubenswrapper[4743]: I1125 16:58:33.633289 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fs9jd" podStartSLOduration=3.004015979 podStartE2EDuration="6.633268615s" podCreationTimestamp="2025-11-25 16:58:27 +0000 UTC" firstStartedPulling="2025-11-25 16:58:29.559632513 +0000 UTC m=+3588.681472062" lastFinishedPulling="2025-11-25 16:58:33.188885159 +0000 UTC m=+3592.310724698" observedRunningTime="2025-11-25 16:58:33.626018238 +0000 UTC m=+3592.747857807" watchObservedRunningTime="2025-11-25 16:58:33.633268615 +0000 UTC m=+3592.755108164"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.536082 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.693446 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.738518 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.750812 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.892947 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.908674 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/extract/0.log"
Nov 25 16:58:36 crc kubenswrapper[4743]: I1125 16:58:36.980257 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.066633 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-82tqm_9b77ce44-3830-488e-ac40-97af4d969f6e/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.239383 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wk72z_0729dc1e-3e2c-410e-892d-ef4773882665/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.295244 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wk72z_0729dc1e-3e2c-410e-892d-ef4773882665/manager/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.324531 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-82tqm_9b77ce44-3830-488e-ac40-97af4d969f6e/manager/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.444526 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-fwlrd_6a470e3c-9cac-463b-a253-308f3c386725/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.507286 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-fwlrd_6a470e3c-9cac-463b-a253-308f3c386725/manager/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.574904 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-q8p2b_288e97c2-c236-4177-9a52-bcf1c6c69faa/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.709626 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-q8p2b_288e97c2-c236-4177-9a52-bcf1c6c69faa/manager/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.749670 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-prmnw_29690625-5e1d-417a-b0e5-9d74645b31f7/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.800629 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-prmnw_29690625-5e1d-417a-b0e5-9d74645b31f7/manager/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.855288 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.855336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.899939 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-zq7mp_aebddcf8-77ce-4317-94c3-f29b45f93686/kube-rbac-proxy/0.log"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.912551 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:37 crc kubenswrapper[4743]: I1125 16:58:37.945398 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-zq7mp_aebddcf8-77ce-4317-94c3-f29b45f93686/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.082225 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-zdjj7_d0cd465a-f903-48ef-aca1-839a390d3f12/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.179096 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f7g4h_38527a3c-d051-4354-a8ca-0692153762f1/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.255126 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-zdjj7_d0cd465a-f903-48ef-aca1-839a390d3f12/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.290320 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f7g4h_38527a3c-d051-4354-a8ca-0692153762f1/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.358656 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-z5rhv_bc109d32-7111-40b6-aff6-7596c933114f/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.514095 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-z5rhv_bc109d32-7111-40b6-aff6-7596c933114f/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.520785 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-8s47h_9caca3f1-e43f-47ab-aa8a-1248a30cfda4/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.550043 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-8s47h_9caca3f1-e43f-47ab-aa8a-1248a30cfda4/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.710066 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-djxpp_9f422105-6959-44e5-93e2-901fd9b84dfc/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.729224 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.736550 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-djxpp_9f422105-6959-44e5-93e2-901fd9b84dfc/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.905057 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-j9zq7_8d418847-cf8f-4977-bc14-3d4b64591e68/kube-rbac-proxy/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.954992 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-j9zq7_8d418847-cf8f-4977-bc14-3d4b64591e68/manager/0.log"
Nov 25 16:58:38 crc kubenswrapper[4743]: I1125 16:58:38.982557 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-bxdjg_88757539-b3d4-4de5-bc96-a4cd13d5a203/kube-rbac-proxy/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.139979 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-bxdjg_88757539-b3d4-4de5-bc96-a4cd13d5a203/manager/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.178257 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-bvwmw_44e2f27d-a5d4-48cf-90f5-2f5598a2295a/kube-rbac-proxy/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.178415 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-bvwmw_44e2f27d-a5d4-48cf-90f5-2f5598a2295a/manager/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.418111 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8_94418bc2-d439-451f-91c2-c457a200825e/manager/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.473085 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8_94418bc2-d439-451f-91c2-c457a200825e/kube-rbac-proxy/0.log"
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.507100 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fs9jd"]
Nov 25 16:58:39 crc kubenswrapper[4743]: I1125 16:58:39.936520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-59fdcdbdd4-vrqql_e289302a-7d4a-4b30-94fe-5babb338505d/operator/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.033462 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9gt7m_7413a348-450f-4717-a52f-595041381991/registry-server/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.205117 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-mwm9p_e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf/kube-rbac-proxy/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.324948 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-g5bp8_7be9f6fc-3582-4e14-a452-daa24035d10e/kube-rbac-proxy/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.403100 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-mwm9p_e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf/manager/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.452082 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-g5bp8_7be9f6fc-3582-4e14-a452-daa24035d10e/manager/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.570535 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-g4sq7_66ead12f-65d1-4438-80b0-1a747105d7fc/operator/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.660737 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-smsbr_e2417720-74c0-4232-9f99-cdc10e485c91/kube-rbac-proxy/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.672756 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fs9jd" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="registry-server" containerID="cri-o://18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46" gracePeriod=2
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.753139 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-smsbr_e2417720-74c0-4232-9f99-cdc10e485c91/manager/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.872718 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-8lsj8_d4e33a37-ac1e-408c-b0d3-a1352daa67af/kube-rbac-proxy/0.log"
Nov 25 16:58:40 crc kubenswrapper[4743]: I1125 16:58:40.997298 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-8lsj8_d4e33a37-ac1e-408c-b0d3-a1352daa67af/manager/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.063772 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-xbxjh_05a605e3-814f-45a4-8461-47cbb3330652/kube-rbac-proxy/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.087772 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-xbxjh_05a605e3-814f-45a4-8461-47cbb3330652/manager/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.168482 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fs9jd"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.191864 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-pjmtx_5a86bde8-f04d-4bfa-842f-6c960d7232fb/kube-rbac-proxy/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.199897 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-746c9d5b4f-z2hm7_20c829b2-be6f-4f96-85c1-21279d871c99/manager/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.290638 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-pjmtx_5a86bde8-f04d-4bfa-842f-6c960d7232fb/manager/0.log"
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.299273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content\") pod \"6eb8ab64-a651-4e94-9489-1686fe286843\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") "
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.299651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8btw\" (UniqueName: \"kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw\") pod \"6eb8ab64-a651-4e94-9489-1686fe286843\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") "
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.299864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities\") pod \"6eb8ab64-a651-4e94-9489-1686fe286843\" (UID: \"6eb8ab64-a651-4e94-9489-1686fe286843\") "
Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.300916 4743
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities" (OuterVolumeSpecName: "utilities") pod "6eb8ab64-a651-4e94-9489-1686fe286843" (UID: "6eb8ab64-a651-4e94-9489-1686fe286843"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.305820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw" (OuterVolumeSpecName: "kube-api-access-v8btw") pod "6eb8ab64-a651-4e94-9489-1686fe286843" (UID: "6eb8ab64-a651-4e94-9489-1686fe286843"). InnerVolumeSpecName "kube-api-access-v8btw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.353749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eb8ab64-a651-4e94-9489-1686fe286843" (UID: "6eb8ab64-a651-4e94-9489-1686fe286843"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.401964 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.401996 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8btw\" (UniqueName: \"kubernetes.io/projected/6eb8ab64-a651-4e94-9489-1686fe286843-kube-api-access-v8btw\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.402006 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb8ab64-a651-4e94-9489-1686fe286843-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.683415 4743 generic.go:334] "Generic (PLEG): container finished" podID="6eb8ab64-a651-4e94-9489-1686fe286843" containerID="18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46" exitCode=0 Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.683465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerDied","Data":"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46"} Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.683502 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fs9jd" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.683527 4743 scope.go:117] "RemoveContainer" containerID="18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.683513 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fs9jd" event={"ID":"6eb8ab64-a651-4e94-9489-1686fe286843","Type":"ContainerDied","Data":"b5658eb46997a1d438c4e828271f3dff35abe0c44398b627516d40c94bda300b"} Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.705252 4743 scope.go:117] "RemoveContainer" containerID="2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.715369 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fs9jd"] Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.727731 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fs9jd"] Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.737995 4743 scope.go:117] "RemoveContainer" containerID="8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.784856 4743 scope.go:117] "RemoveContainer" containerID="18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46" Nov 25 16:58:41 crc kubenswrapper[4743]: E1125 16:58:41.789162 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46\": container with ID starting with 18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46 not found: ID does not exist" containerID="18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.789217 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46"} err="failed to get container status \"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46\": rpc error: code = NotFound desc = could not find container \"18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46\": container with ID starting with 18be7fa46df041fc58c65e1b95344a7e0047ad146cfe006cad51562377050e46 not found: ID does not exist" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.789436 4743 scope.go:117] "RemoveContainer" containerID="2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7" Nov 25 16:58:41 crc kubenswrapper[4743]: E1125 16:58:41.802991 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7\": container with ID starting with 2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7 not found: ID does not exist" containerID="2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.803036 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7"} err="failed to get container status \"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7\": rpc error: code = NotFound desc = could not find container \"2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7\": container with ID starting with 2b909f4153709d7621ed5de2c4d907b77eb19fa9701b019149f518bee1b7a4d7 not found: ID does not exist" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.803068 4743 scope.go:117] "RemoveContainer" containerID="8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e" Nov 25 16:58:41 crc kubenswrapper[4743]: E1125 
16:58:41.806098 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e\": container with ID starting with 8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e not found: ID does not exist" containerID="8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.806144 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e"} err="failed to get container status \"8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e\": rpc error: code = NotFound desc = could not find container \"8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e\": container with ID starting with 8111a98f8f3fc312bcdc5eb6879017c7bd41cb1a175cec28ac4f27ae728e303e not found: ID does not exist" Nov 25 16:58:41 crc kubenswrapper[4743]: I1125 16:58:41.814880 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" path="/var/lib/kubelet/pods/6eb8ab64-a651-4e94-9489-1686fe286843/volumes" Nov 25 16:58:50 crc kubenswrapper[4743]: I1125 16:58:50.077758 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:58:50 crc kubenswrapper[4743]: I1125 16:58:50.078410 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 25 16:58:56 crc kubenswrapper[4743]: I1125 16:58:56.030427 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8vr4f_fb60121d-df03-4f88-a9e5-118105c6ce94/control-plane-machine-set-operator/0.log" Nov 25 16:58:56 crc kubenswrapper[4743]: I1125 16:58:56.187041 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j9hhf_a655856c-3900-4342-a094-dc03b84c8876/kube-rbac-proxy/0.log" Nov 25 16:58:56 crc kubenswrapper[4743]: I1125 16:58:56.199379 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j9hhf_a655856c-3900-4342-a094-dc03b84c8876/machine-api-operator/0.log" Nov 25 16:59:06 crc kubenswrapper[4743]: I1125 16:59:06.669055 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cn9d5_779c1d2b-063b-413b-80c1-63c1b5438aff/cert-manager-controller/0.log" Nov 25 16:59:06 crc kubenswrapper[4743]: I1125 16:59:06.813225 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-s8482_6afbc225-6b21-4fe7-80a6-9fe85ffcac89/cert-manager-cainjector/0.log" Nov 25 16:59:06 crc kubenswrapper[4743]: I1125 16:59:06.835108 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-clc4q_9e38fcf4-5a14-44c4-b8ad-970d07e82284/cert-manager-webhook/0.log" Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.242909 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-zhqdb_994ed247-8a08-4164-89b4-c03a90c4ef5d/nmstate-console-plugin/0.log" Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.384104 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6cznj_25afedc8-ae76-4c86-aeaa-c739b1458040/nmstate-handler/0.log" 
Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.456241 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-7q45f_62b25135-7567-4053-ab8a-5df129154693/kube-rbac-proxy/0.log" Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.504430 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-7q45f_62b25135-7567-4053-ab8a-5df129154693/nmstate-metrics/0.log" Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.625115 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-6mtvn_a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae/nmstate-operator/0.log" Nov 25 16:59:17 crc kubenswrapper[4743]: I1125 16:59:17.702149 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-clgq2_b423b0b1-b7c2-4a09-a332-cc9c03bfca51/nmstate-webhook/0.log" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.077707 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.079328 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.079813 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.080523 4743 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.080624 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" gracePeriod=600 Nov 25 16:59:20 crc kubenswrapper[4743]: E1125 16:59:20.241542 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.991801 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" exitCode=0 Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.991849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76"} Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.991885 4743 scope.go:117] "RemoveContainer" 
containerID="5b75d2571f6e149112ed55aa52e02a74966ac42a332b403fa4f32ad757b2ef8c" Nov 25 16:59:20 crc kubenswrapper[4743]: I1125 16:59:20.992685 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 16:59:20 crc kubenswrapper[4743]: E1125 16:59:20.993148 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:59:30 crc kubenswrapper[4743]: I1125 16:59:30.822116 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-685f4_d805cc14-bb31-4762-9079-dedb5e33e391/kube-rbac-proxy/0.log" Nov 25 16:59:30 crc kubenswrapper[4743]: I1125 16:59:30.941679 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-685f4_d805cc14-bb31-4762-9079-dedb5e33e391/controller/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.055089 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.211685 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.211845 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.230920 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.284819 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.464308 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.470392 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.470448 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.473130 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.617724 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.639238 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.678007 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/controller/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.690146 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.841461 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/frr-metrics/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.853995 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/kube-rbac-proxy-frr/0.log" Nov 25 16:59:31 crc kubenswrapper[4743]: I1125 16:59:31.855750 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/kube-rbac-proxy/0.log" Nov 25 16:59:32 crc kubenswrapper[4743]: I1125 16:59:32.039852 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/reloader/0.log" Nov 25 16:59:32 crc kubenswrapper[4743]: I1125 16:59:32.072818 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-q2tm6_77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec/frr-k8s-webhook-server/0.log" Nov 25 16:59:32 crc kubenswrapper[4743]: I1125 16:59:32.286085 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b6fb67cbb-wpj49_03ed5f22-b285-4560-8572-798606c90e7b/manager/0.log" Nov 25 16:59:32 crc kubenswrapper[4743]: I1125 16:59:32.449258 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-84c6f5f694-f9nf8_bed7e486-cad7-437c-8196-4fc08dd20eb6/webhook-server/0.log" Nov 25 16:59:32 crc kubenswrapper[4743]: I1125 16:59:32.561359 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8f677_fd919f22-093b-4ba9-bbc1-06a5360f6f32/kube-rbac-proxy/0.log" Nov 25 16:59:33 crc kubenswrapper[4743]: I1125 16:59:33.108220 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8f677_fd919f22-093b-4ba9-bbc1-06a5360f6f32/speaker/0.log" Nov 25 16:59:33 crc kubenswrapper[4743]: I1125 16:59:33.206929 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/frr/0.log" Nov 25 16:59:33 crc kubenswrapper[4743]: I1125 16:59:33.775333 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 16:59:33 crc kubenswrapper[4743]: E1125 16:59:33.775684 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.889146 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:41 crc kubenswrapper[4743]: E1125 16:59:41.891692 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="extract-utilities" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.891817 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="extract-utilities" Nov 25 16:59:41 crc kubenswrapper[4743]: E1125 16:59:41.891914 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="extract-content" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.892005 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="extract-content" Nov 25 16:59:41 crc 
kubenswrapper[4743]: E1125 16:59:41.892130 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="registry-server" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.892214 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="registry-server" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.892541 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb8ab64-a651-4e94-9489-1686fe286843" containerName="registry-server" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.894236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.904867 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.994842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.995198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:41 crc kubenswrapper[4743]: I1125 16:59:41.995261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sslt\" (UniqueName: 
\"kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.097330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sslt\" (UniqueName: \"kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.097425 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.097484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.097930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.098514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.117810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sslt\" (UniqueName: \"kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt\") pod \"redhat-operators-cn9rn\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.218790 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:42 crc kubenswrapper[4743]: W1125 16:59:42.675378 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c44a79a_74c8_492d_b496_e25610aa214f.slice/crio-de8f29bfb6e2690eba862856a8408213980ab74d48d0d5cf6a8691d1f2c05898 WatchSource:0}: Error finding container de8f29bfb6e2690eba862856a8408213980ab74d48d0d5cf6a8691d1f2c05898: Status 404 returned error can't find the container with id de8f29bfb6e2690eba862856a8408213980ab74d48d0d5cf6a8691d1f2c05898 Nov 25 16:59:42 crc kubenswrapper[4743]: I1125 16:59:42.677137 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:43 crc kubenswrapper[4743]: I1125 16:59:43.180468 4743 generic.go:334] "Generic (PLEG): container finished" podID="4c44a79a-74c8-492d-b496-e25610aa214f" containerID="d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5" exitCode=0 Nov 25 16:59:43 crc kubenswrapper[4743]: I1125 16:59:43.180574 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" 
event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerDied","Data":"d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5"} Nov 25 16:59:43 crc kubenswrapper[4743]: I1125 16:59:43.180804 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerStarted","Data":"de8f29bfb6e2690eba862856a8408213980ab74d48d0d5cf6a8691d1f2c05898"} Nov 25 16:59:44 crc kubenswrapper[4743]: I1125 16:59:44.190257 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerStarted","Data":"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57"} Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.065953 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.178460 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.201174 4743 generic.go:334] "Generic (PLEG): container finished" podID="4c44a79a-74c8-492d-b496-e25610aa214f" containerID="503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57" exitCode=0 Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.201250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerDied","Data":"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57"} Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.244948 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.246743 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.417397 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.438821 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.439170 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/extract/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.591172 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.753367 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.775197 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 16:59:45 crc kubenswrapper[4743]: E1125 16:59:45.775630 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.802722 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.830998 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.931575 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 16:59:45 crc kubenswrapper[4743]: I1125 16:59:45.945352 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.131816 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.211137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerStarted","Data":"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc"} Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.243435 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-cn9rn" podStartSLOduration=2.840757108 podStartE2EDuration="5.243412786s" podCreationTimestamp="2025-11-25 16:59:41 +0000 UTC" firstStartedPulling="2025-11-25 16:59:43.181790762 +0000 UTC m=+3662.303630311" lastFinishedPulling="2025-11-25 16:59:45.58444644 +0000 UTC m=+3664.706285989" observedRunningTime="2025-11-25 16:59:46.231553092 +0000 UTC m=+3665.353392641" watchObservedRunningTime="2025-11-25 16:59:46.243412786 +0000 UTC m=+3665.365252335" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.374481 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/registry-server/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.406988 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.429829 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.466987 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.636310 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.687402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 16:59:46 crc kubenswrapper[4743]: I1125 16:59:46.832136 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.102250 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.140426 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.186511 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.403385 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.440687 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/registry-server/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.443076 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/extract/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.447226 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 16:59:47 
crc kubenswrapper[4743]: I1125 16:59:47.660832 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r4ckx_84c01433-ed4b-4b70-8473-7905b701f657/marketplace-operator/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.669513 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.904470 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.904471 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 16:59:47 crc kubenswrapper[4743]: I1125 16:59:47.906974 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.084920 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.128609 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.307643 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/registry-server/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.390896 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.525040 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.566243 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-content/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.578394 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-content/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.742842 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.782674 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/registry-server/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.786062 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 16:59:48 crc kubenswrapper[4743]: I1125 16:59:48.800552 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cn9rn_4c44a79a-74c8-492d-b496-e25610aa214f/extract-content/0.log" Nov 25 16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.121937 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 
16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.129968 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.137097 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.290959 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.325446 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 16:59:49 crc kubenswrapper[4743]: I1125 16:59:49.772849 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/registry-server/0.log" Nov 25 16:59:52 crc kubenswrapper[4743]: I1125 16:59:52.219821 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:52 crc kubenswrapper[4743]: I1125 16:59:52.220147 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:52 crc kubenswrapper[4743]: I1125 16:59:52.271044 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:52 crc kubenswrapper[4743]: I1125 16:59:52.317169 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:52 crc kubenswrapper[4743]: I1125 16:59:52.506223 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.275209 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cn9rn" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="registry-server" containerID="cri-o://41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc" gracePeriod=2 Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.719838 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.854473 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content\") pod \"4c44a79a-74c8-492d-b496-e25610aa214f\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.854527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sslt\" (UniqueName: \"kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt\") pod \"4c44a79a-74c8-492d-b496-e25610aa214f\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.854743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities\") pod \"4c44a79a-74c8-492d-b496-e25610aa214f\" (UID: \"4c44a79a-74c8-492d-b496-e25610aa214f\") " Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.855400 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities" (OuterVolumeSpecName: "utilities") pod 
"4c44a79a-74c8-492d-b496-e25610aa214f" (UID: "4c44a79a-74c8-492d-b496-e25610aa214f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.861648 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt" (OuterVolumeSpecName: "kube-api-access-9sslt") pod "4c44a79a-74c8-492d-b496-e25610aa214f" (UID: "4c44a79a-74c8-492d-b496-e25610aa214f"). InnerVolumeSpecName "kube-api-access-9sslt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.948504 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c44a79a-74c8-492d-b496-e25610aa214f" (UID: "4c44a79a-74c8-492d-b496-e25610aa214f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.957422 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.957471 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c44a79a-74c8-492d-b496-e25610aa214f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:59:54 crc kubenswrapper[4743]: I1125 16:59:54.957485 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sslt\" (UniqueName: \"kubernetes.io/projected/4c44a79a-74c8-492d-b496-e25610aa214f-kube-api-access-9sslt\") on node \"crc\" DevicePath \"\"" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.285130 4743 generic.go:334] "Generic (PLEG): container finished" podID="4c44a79a-74c8-492d-b496-e25610aa214f" containerID="41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc" exitCode=0 Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.285178 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerDied","Data":"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc"} Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.285189 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn9rn" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.285210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn9rn" event={"ID":"4c44a79a-74c8-492d-b496-e25610aa214f","Type":"ContainerDied","Data":"de8f29bfb6e2690eba862856a8408213980ab74d48d0d5cf6a8691d1f2c05898"} Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.285229 4743 scope.go:117] "RemoveContainer" containerID="41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.311275 4743 scope.go:117] "RemoveContainer" containerID="503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.327123 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.337330 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cn9rn"] Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.344045 4743 scope.go:117] "RemoveContainer" containerID="d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.380103 4743 scope.go:117] "RemoveContainer" containerID="41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc" Nov 25 16:59:55 crc kubenswrapper[4743]: E1125 16:59:55.380526 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc\": container with ID starting with 41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc not found: ID does not exist" containerID="41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.380572 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc"} err="failed to get container status \"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc\": rpc error: code = NotFound desc = could not find container \"41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc\": container with ID starting with 41bea7cb0f785feb402279cbdef551f50bb3e336afa75a21729bb7388c190fdc not found: ID does not exist" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.380680 4743 scope.go:117] "RemoveContainer" containerID="503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57" Nov 25 16:59:55 crc kubenswrapper[4743]: E1125 16:59:55.381081 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57\": container with ID starting with 503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57 not found: ID does not exist" containerID="503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.381117 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57"} err="failed to get container status \"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57\": rpc error: code = NotFound desc = could not find container \"503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57\": container with ID starting with 503c8986337cd20735e89df03549228493b2ce69178debff5078a5be9ab26c57 not found: ID does not exist" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.381144 4743 scope.go:117] "RemoveContainer" containerID="d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5" Nov 25 16:59:55 crc kubenswrapper[4743]: E1125 
16:59:55.381406 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5\": container with ID starting with d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5 not found: ID does not exist" containerID="d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.381449 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5"} err="failed to get container status \"d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5\": rpc error: code = NotFound desc = could not find container \"d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5\": container with ID starting with d3ffb500b9e6513636d5cdab446d78e7fab4ae6a25128a45fb3bbe4d8fbe65d5 not found: ID does not exist" Nov 25 16:59:55 crc kubenswrapper[4743]: I1125 16:59:55.788964 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" path="/var/lib/kubelet/pods/4c44a79a-74c8-492d-b496-e25610aa214f/volumes" Nov 25 16:59:57 crc kubenswrapper[4743]: I1125 16:59:57.775856 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 16:59:57 crc kubenswrapper[4743]: E1125 16:59:57.776179 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.173565 
4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw"] Nov 25 17:00:00 crc kubenswrapper[4743]: E1125 17:00:00.174559 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="extract-utilities" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.174571 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="extract-utilities" Nov 25 17:00:00 crc kubenswrapper[4743]: E1125 17:00:00.174611 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="extract-content" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.174617 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="extract-content" Nov 25 17:00:00 crc kubenswrapper[4743]: E1125 17:00:00.174646 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="registry-server" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.174653 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="registry-server" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.174837 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c44a79a-74c8-492d-b496-e25610aa214f" containerName="registry-server" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.175450 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.184728 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.185462 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.187701 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw"] Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.255095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.255204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.255431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx49r\" (UniqueName: \"kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.357308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx49r\" (UniqueName: \"kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.357437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.357480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.358554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.364794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.377086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx49r\" (UniqueName: \"kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r\") pod \"collect-profiles-29401500-4kscw\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.493560 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:00 crc kubenswrapper[4743]: I1125 17:00:00.963524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw"] Nov 25 17:00:01 crc kubenswrapper[4743]: I1125 17:00:01.336052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" event={"ID":"700f32cb-f97e-4dec-8d5b-f093a3abf4fc","Type":"ContainerStarted","Data":"5fca939ad27fa36f239ea043684186c131fd4e4b5a676c6425bac3092210999d"} Nov 25 17:00:01 crc kubenswrapper[4743]: I1125 17:00:01.336103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" event={"ID":"700f32cb-f97e-4dec-8d5b-f093a3abf4fc","Type":"ContainerStarted","Data":"2bffe919919f6b78c5f9088a945e48043ed49f28789403cee19e87b1678ddb56"} Nov 25 17:00:01 crc kubenswrapper[4743]: I1125 17:00:01.354878 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" 
podStartSLOduration=1.354858676 podStartE2EDuration="1.354858676s" podCreationTimestamp="2025-11-25 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 17:00:01.349424885 +0000 UTC m=+3680.471264444" watchObservedRunningTime="2025-11-25 17:00:01.354858676 +0000 UTC m=+3680.476698225" Nov 25 17:00:02 crc kubenswrapper[4743]: I1125 17:00:02.346885 4743 generic.go:334] "Generic (PLEG): container finished" podID="700f32cb-f97e-4dec-8d5b-f093a3abf4fc" containerID="5fca939ad27fa36f239ea043684186c131fd4e4b5a676c6425bac3092210999d" exitCode=0 Nov 25 17:00:02 crc kubenswrapper[4743]: I1125 17:00:02.346975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" event={"ID":"700f32cb-f97e-4dec-8d5b-f093a3abf4fc","Type":"ContainerDied","Data":"5fca939ad27fa36f239ea043684186c131fd4e4b5a676c6425bac3092210999d"} Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.778964 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.922797 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx49r\" (UniqueName: \"kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r\") pod \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.922999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume\") pod \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.923109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume\") pod \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\" (UID: \"700f32cb-f97e-4dec-8d5b-f093a3abf4fc\") " Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.923857 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "700f32cb-f97e-4dec-8d5b-f093a3abf4fc" (UID: "700f32cb-f97e-4dec-8d5b-f093a3abf4fc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.924310 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.943087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r" (OuterVolumeSpecName: "kube-api-access-lx49r") pod "700f32cb-f97e-4dec-8d5b-f093a3abf4fc" (UID: "700f32cb-f97e-4dec-8d5b-f093a3abf4fc"). InnerVolumeSpecName "kube-api-access-lx49r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:00:03 crc kubenswrapper[4743]: I1125 17:00:03.952037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "700f32cb-f97e-4dec-8d5b-f093a3abf4fc" (UID: "700f32cb-f97e-4dec-8d5b-f093a3abf4fc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.025872 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx49r\" (UniqueName: \"kubernetes.io/projected/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-kube-api-access-lx49r\") on node \"crc\" DevicePath \"\"" Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.025902 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/700f32cb-f97e-4dec-8d5b-f093a3abf4fc-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.363777 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" event={"ID":"700f32cb-f97e-4dec-8d5b-f093a3abf4fc","Type":"ContainerDied","Data":"2bffe919919f6b78c5f9088a945e48043ed49f28789403cee19e87b1678ddb56"} Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.363816 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bffe919919f6b78c5f9088a945e48043ed49f28789403cee19e87b1678ddb56" Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.363870 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401500-4kscw" Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.434932 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"] Nov 25 17:00:04 crc kubenswrapper[4743]: I1125 17:00:04.445529 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401455-nvm56"] Nov 25 17:00:05 crc kubenswrapper[4743]: I1125 17:00:05.788229 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d70c16b3-cf20-4bfe-a816-00d5fd9c2885" path="/var/lib/kubelet/pods/d70c16b3-cf20-4bfe-a816-00d5fd9c2885/volumes" Nov 25 17:00:10 crc kubenswrapper[4743]: I1125 17:00:10.775414 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:00:10 crc kubenswrapper[4743]: E1125 17:00:10.776182 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:00:11 crc kubenswrapper[4743]: I1125 17:00:11.169781 4743 scope.go:117] "RemoveContainer" containerID="9cc92ade8000551cfd97686c8040d267c296253f7248447560dd0af4dfa0e206" Nov 25 17:00:22 crc kubenswrapper[4743]: I1125 17:00:22.776184 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:00:22 crc kubenswrapper[4743]: E1125 17:00:22.777218 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:00:34 crc kubenswrapper[4743]: I1125 17:00:34.775709 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:00:34 crc kubenswrapper[4743]: E1125 17:00:34.776750 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:00:49 crc kubenswrapper[4743]: I1125 17:00:49.774692 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:00:49 crc kubenswrapper[4743]: E1125 17:00:49.775450 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.153427 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29401501-bfhng"] Nov 25 17:01:00 crc kubenswrapper[4743]: E1125 17:01:00.162129 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700f32cb-f97e-4dec-8d5b-f093a3abf4fc" containerName="collect-profiles" Nov 25 17:01:00 
crc kubenswrapper[4743]: I1125 17:01:00.162187 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="700f32cb-f97e-4dec-8d5b-f093a3abf4fc" containerName="collect-profiles" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.162359 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="700f32cb-f97e-4dec-8d5b-f093a3abf4fc" containerName="collect-profiles" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.162966 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.166218 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401501-bfhng"] Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.268658 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxqb\" (UniqueName: \"kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.268927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.269272 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 
17:01:00.269324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.370802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.370845 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.370906 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxqb\" (UniqueName: \"kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.370954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.377304 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.377320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.379339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.388087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxqb\" (UniqueName: \"kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb\") pod \"keystone-cron-29401501-bfhng\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.485673 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:00 crc kubenswrapper[4743]: I1125 17:01:00.935453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401501-bfhng"] Nov 25 17:01:01 crc kubenswrapper[4743]: I1125 17:01:01.783797 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:01:01 crc kubenswrapper[4743]: E1125 17:01:01.795654 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:01:01 crc kubenswrapper[4743]: I1125 17:01:01.866889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401501-bfhng" event={"ID":"e84e92f3-1680-4cac-b59e-1b783e572d24","Type":"ContainerStarted","Data":"b1e96133e0bd9ae406c8f5c27b819e3ff8b8182d5b001e4561f171aa3ce70aaf"} Nov 25 17:01:01 crc kubenswrapper[4743]: I1125 17:01:01.866932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401501-bfhng" event={"ID":"e84e92f3-1680-4cac-b59e-1b783e572d24","Type":"ContainerStarted","Data":"0e621c2e967e139da2ecfce6788c00f4f0e4e487b0b4325699bf5e4cb97ee005"} Nov 25 17:01:01 crc kubenswrapper[4743]: I1125 17:01:01.888329 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29401501-bfhng" podStartSLOduration=1.8883141559999999 podStartE2EDuration="1.888314156s" podCreationTimestamp="2025-11-25 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 17:01:01.887855081 
+0000 UTC m=+3741.009694650" watchObservedRunningTime="2025-11-25 17:01:01.888314156 +0000 UTC m=+3741.010153705" Nov 25 17:01:03 crc kubenswrapper[4743]: I1125 17:01:03.891619 4743 generic.go:334] "Generic (PLEG): container finished" podID="e84e92f3-1680-4cac-b59e-1b783e572d24" containerID="b1e96133e0bd9ae406c8f5c27b819e3ff8b8182d5b001e4561f171aa3ce70aaf" exitCode=0 Nov 25 17:01:03 crc kubenswrapper[4743]: I1125 17:01:03.891670 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401501-bfhng" event={"ID":"e84e92f3-1680-4cac-b59e-1b783e572d24","Type":"ContainerDied","Data":"b1e96133e0bd9ae406c8f5c27b819e3ff8b8182d5b001e4561f171aa3ce70aaf"} Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.204764 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.380870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data\") pod \"e84e92f3-1680-4cac-b59e-1b783e572d24\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.380943 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsxqb\" (UniqueName: \"kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb\") pod \"e84e92f3-1680-4cac-b59e-1b783e572d24\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.381032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle\") pod \"e84e92f3-1680-4cac-b59e-1b783e572d24\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 
17:01:05.382472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys\") pod \"e84e92f3-1680-4cac-b59e-1b783e572d24\" (UID: \"e84e92f3-1680-4cac-b59e-1b783e572d24\") " Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.401772 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e84e92f3-1680-4cac-b59e-1b783e572d24" (UID: "e84e92f3-1680-4cac-b59e-1b783e572d24"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.401879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb" (OuterVolumeSpecName: "kube-api-access-wsxqb") pod "e84e92f3-1680-4cac-b59e-1b783e572d24" (UID: "e84e92f3-1680-4cac-b59e-1b783e572d24"). InnerVolumeSpecName "kube-api-access-wsxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.411143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e84e92f3-1680-4cac-b59e-1b783e572d24" (UID: "e84e92f3-1680-4cac-b59e-1b783e572d24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.463924 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data" (OuterVolumeSpecName: "config-data") pod "e84e92f3-1680-4cac-b59e-1b783e572d24" (UID: "e84e92f3-1680-4cac-b59e-1b783e572d24"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.486184 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsxqb\" (UniqueName: \"kubernetes.io/projected/e84e92f3-1680-4cac-b59e-1b783e572d24-kube-api-access-wsxqb\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.486318 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.486348 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.486375 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e84e92f3-1680-4cac-b59e-1b783e572d24-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.908646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401501-bfhng" event={"ID":"e84e92f3-1680-4cac-b59e-1b783e572d24","Type":"ContainerDied","Data":"0e621c2e967e139da2ecfce6788c00f4f0e4e487b0b4325699bf5e4cb97ee005"} Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.908988 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e621c2e967e139da2ecfce6788c00f4f0e4e487b0b4325699bf5e4cb97ee005" Nov 25 17:01:05 crc kubenswrapper[4743]: I1125 17:01:05.908746 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401501-bfhng" Nov 25 17:01:16 crc kubenswrapper[4743]: I1125 17:01:16.777581 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:01:16 crc kubenswrapper[4743]: E1125 17:01:16.779380 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:01:27 crc kubenswrapper[4743]: I1125 17:01:27.775054 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:01:27 crc kubenswrapper[4743]: E1125 17:01:27.775872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:01:29 crc kubenswrapper[4743]: I1125 17:01:29.158808 4743 generic.go:334] "Generic (PLEG): container finished" podID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerID="250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83" exitCode=0 Nov 25 17:01:29 crc kubenswrapper[4743]: I1125 17:01:29.158887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" event={"ID":"2bb7ee6d-6062-455a-b0cd-39aa268fc029","Type":"ContainerDied","Data":"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83"} Nov 25 
17:01:29 crc kubenswrapper[4743]: I1125 17:01:29.160352 4743 scope.go:117] "RemoveContainer" containerID="250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83" Nov 25 17:01:30 crc kubenswrapper[4743]: I1125 17:01:30.011316 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9x9kh_must-gather-t9zq6_2bb7ee6d-6062-455a-b0cd-39aa268fc029/gather/0.log" Nov 25 17:01:37 crc kubenswrapper[4743]: I1125 17:01:37.680907 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9x9kh/must-gather-t9zq6"] Nov 25 17:01:37 crc kubenswrapper[4743]: I1125 17:01:37.681526 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="copy" containerID="cri-o://8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a" gracePeriod=2 Nov 25 17:01:37 crc kubenswrapper[4743]: I1125 17:01:37.689931 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9x9kh/must-gather-t9zq6"] Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.145572 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9x9kh_must-gather-t9zq6_2bb7ee6d-6062-455a-b0cd-39aa268fc029/copy/0.log" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.146306 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.243927 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9x9kh_must-gather-t9zq6_2bb7ee6d-6062-455a-b0cd-39aa268fc029/copy/0.log" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.244426 4743 generic.go:334] "Generic (PLEG): container finished" podID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerID="8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a" exitCode=143 Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.244469 4743 scope.go:117] "RemoveContainer" containerID="8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.244470 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9x9kh/must-gather-t9zq6" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.256523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output\") pod \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.256694 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fjls\" (UniqueName: \"kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls\") pod \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\" (UID: \"2bb7ee6d-6062-455a-b0cd-39aa268fc029\") " Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.262146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls" (OuterVolumeSpecName: "kube-api-access-5fjls") pod "2bb7ee6d-6062-455a-b0cd-39aa268fc029" (UID: 
"2bb7ee6d-6062-455a-b0cd-39aa268fc029"). InnerVolumeSpecName "kube-api-access-5fjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.262407 4743 scope.go:117] "RemoveContainer" containerID="250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.342746 4743 scope.go:117] "RemoveContainer" containerID="8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a" Nov 25 17:01:38 crc kubenswrapper[4743]: E1125 17:01:38.343415 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a\": container with ID starting with 8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a not found: ID does not exist" containerID="8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.343468 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a"} err="failed to get container status \"8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a\": rpc error: code = NotFound desc = could not find container \"8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a\": container with ID starting with 8ba09bf397bc5d662154ff9ba1f98f6fbad99242cef84902b289fad766755e4a not found: ID does not exist" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.343498 4743 scope.go:117] "RemoveContainer" containerID="250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83" Nov 25 17:01:38 crc kubenswrapper[4743]: E1125 17:01:38.343947 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83\": 
container with ID starting with 250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83 not found: ID does not exist" containerID="250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.343994 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83"} err="failed to get container status \"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83\": rpc error: code = NotFound desc = could not find container \"250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83\": container with ID starting with 250405a547a3afcdefe2a350a3e472490e9da2e17f80a4884842306a6f680c83 not found: ID does not exist" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.359928 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fjls\" (UniqueName: \"kubernetes.io/projected/2bb7ee6d-6062-455a-b0cd-39aa268fc029-kube-api-access-5fjls\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.390785 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2bb7ee6d-6062-455a-b0cd-39aa268fc029" (UID: "2bb7ee6d-6062-455a-b0cd-39aa268fc029"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:01:38 crc kubenswrapper[4743]: I1125 17:01:38.461911 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2bb7ee6d-6062-455a-b0cd-39aa268fc029-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 17:01:39 crc kubenswrapper[4743]: I1125 17:01:39.820881 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" path="/var/lib/kubelet/pods/2bb7ee6d-6062-455a-b0cd-39aa268fc029/volumes" Nov 25 17:01:40 crc kubenswrapper[4743]: I1125 17:01:40.774915 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:01:40 crc kubenswrapper[4743]: E1125 17:01:40.775228 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:01:53 crc kubenswrapper[4743]: I1125 17:01:53.775940 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:01:53 crc kubenswrapper[4743]: E1125 17:01:53.777036 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:02:04 crc kubenswrapper[4743]: I1125 17:02:04.775705 4743 
scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:02:04 crc kubenswrapper[4743]: E1125 17:02:04.776736 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:02:18 crc kubenswrapper[4743]: I1125 17:02:18.775705 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:02:18 crc kubenswrapper[4743]: E1125 17:02:18.777431 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:02:29 crc kubenswrapper[4743]: I1125 17:02:29.775685 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:02:29 crc kubenswrapper[4743]: E1125 17:02:29.776572 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:02:43 crc kubenswrapper[4743]: I1125 
17:02:43.775459 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:02:43 crc kubenswrapper[4743]: E1125 17:02:43.776461 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:02:58 crc kubenswrapper[4743]: I1125 17:02:58.775238 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:02:58 crc kubenswrapper[4743]: E1125 17:02:58.775871 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:03:11 crc kubenswrapper[4743]: I1125 17:03:11.360508 4743 scope.go:117] "RemoveContainer" containerID="e20a738fe814d37ef202a5e404d3bf6da9bf380e3887647ab9440a2eb86aa677" Nov 25 17:03:11 crc kubenswrapper[4743]: I1125 17:03:11.782992 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:03:11 crc kubenswrapper[4743]: E1125 17:03:11.783852 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:03:26 crc kubenswrapper[4743]: I1125 17:03:26.774945 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:03:26 crc kubenswrapper[4743]: E1125 17:03:26.775894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:03:38 crc kubenswrapper[4743]: I1125 17:03:38.775054 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:03:38 crc kubenswrapper[4743]: E1125 17:03:38.776019 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:03:52 crc kubenswrapper[4743]: I1125 17:03:52.776153 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:03:52 crc kubenswrapper[4743]: E1125 17:03:52.777262 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:04:07 crc kubenswrapper[4743]: I1125 17:04:07.775432 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:04:07 crc kubenswrapper[4743]: E1125 17:04:07.776893 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.487436 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt59v/must-gather-lzb6c"] Nov 25 17:04:14 crc kubenswrapper[4743]: E1125 17:04:14.489297 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="copy" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489322 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="copy" Nov 25 17:04:14 crc kubenswrapper[4743]: E1125 17:04:14.489340 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="gather" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489352 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="gather" Nov 25 17:04:14 crc kubenswrapper[4743]: E1125 17:04:14.489376 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e84e92f3-1680-4cac-b59e-1b783e572d24" containerName="keystone-cron" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489387 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84e92f3-1680-4cac-b59e-1b783e572d24" containerName="keystone-cron" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489705 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84e92f3-1680-4cac-b59e-1b783e572d24" containerName="keystone-cron" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489728 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="copy" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.489746 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb7ee6d-6062-455a-b0cd-39aa268fc029" containerName="gather" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.491401 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.493790 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jt59v"/"default-dockercfg-gmv4z" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.495080 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jt59v"/"kube-root-ca.crt" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.495233 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jt59v"/"openshift-service-ca.crt" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.497180 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jt59v/must-gather-lzb6c"] Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.546982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.547051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlhx\" (UniqueName: \"kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.648640 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.648699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlhx\" (UniqueName: \"kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.649078 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.695347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlhx\" (UniqueName: 
\"kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx\") pod \"must-gather-lzb6c\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:14 crc kubenswrapper[4743]: I1125 17:04:14.813028 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:04:15 crc kubenswrapper[4743]: I1125 17:04:15.287837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jt59v/must-gather-lzb6c"] Nov 25 17:04:15 crc kubenswrapper[4743]: I1125 17:04:15.759612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/must-gather-lzb6c" event={"ID":"2339ebd2-9b75-4640-b625-8ce96fbfc1e5","Type":"ContainerStarted","Data":"bb131daf59033c4f22a31c934aac6e49dc0cbd669044c4169987a9d8f66b6027"} Nov 25 17:04:15 crc kubenswrapper[4743]: I1125 17:04:15.760097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/must-gather-lzb6c" event={"ID":"2339ebd2-9b75-4640-b625-8ce96fbfc1e5","Type":"ContainerStarted","Data":"229147b156a9da2b6980324974b28092bbc9120846bd6ff8691b8914747e2d3d"} Nov 25 17:04:16 crc kubenswrapper[4743]: I1125 17:04:16.773453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/must-gather-lzb6c" event={"ID":"2339ebd2-9b75-4640-b625-8ce96fbfc1e5","Type":"ContainerStarted","Data":"303d2387ef7f2b3aee8d643b7e016001e50e5fe70377c00c59505c5e4b0170ca"} Nov 25 17:04:16 crc kubenswrapper[4743]: I1125 17:04:16.804096 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jt59v/must-gather-lzb6c" podStartSLOduration=2.80406711 podStartE2EDuration="2.80406711s" podCreationTimestamp="2025-11-25 17:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 
17:04:16.79229601 +0000 UTC m=+3935.914135559" watchObservedRunningTime="2025-11-25 17:04:16.80406711 +0000 UTC m=+3935.925906689" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.113751 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt59v/crc-debug-d2b6v"] Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.115240 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.241573 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qz97\" (UniqueName: \"kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.242088 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.345180 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qz97\" (UniqueName: \"kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.345902 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " 
pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.346096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.373644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qz97\" (UniqueName: \"kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97\") pod \"crc-debug-d2b6v\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.438982 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:04:19 crc kubenswrapper[4743]: W1125 17:04:19.476097 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e22db5f_a398_4184_adbe_1d64ea16c2d0.slice/crio-8f820786bef3b88a09ab91737f270a6c4d82499068e6f20481df2eb09164e401 WatchSource:0}: Error finding container 8f820786bef3b88a09ab91737f270a6c4d82499068e6f20481df2eb09164e401: Status 404 returned error can't find the container with id 8f820786bef3b88a09ab91737f270a6c4d82499068e6f20481df2eb09164e401 Nov 25 17:04:19 crc kubenswrapper[4743]: I1125 17:04:19.803736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" event={"ID":"9e22db5f-a398-4184-adbe-1d64ea16c2d0","Type":"ContainerStarted","Data":"8f820786bef3b88a09ab91737f270a6c4d82499068e6f20481df2eb09164e401"} Nov 25 17:04:20 crc kubenswrapper[4743]: I1125 17:04:20.816111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-jt59v/crc-debug-d2b6v" event={"ID":"9e22db5f-a398-4184-adbe-1d64ea16c2d0","Type":"ContainerStarted","Data":"2ca770be6a947735ceec602d9229936267985ded6f56bbc476a5b324e28d2846"} Nov 25 17:04:20 crc kubenswrapper[4743]: I1125 17:04:20.830958 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" podStartSLOduration=1.830934902 podStartE2EDuration="1.830934902s" podCreationTimestamp="2025-11-25 17:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 17:04:20.829175677 +0000 UTC m=+3939.951015226" watchObservedRunningTime="2025-11-25 17:04:20.830934902 +0000 UTC m=+3939.952774471" Nov 25 17:04:22 crc kubenswrapper[4743]: I1125 17:04:22.776078 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:04:23 crc kubenswrapper[4743]: I1125 17:04:23.868064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5"} Nov 25 17:04:59 crc kubenswrapper[4743]: I1125 17:04:59.193411 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e22db5f-a398-4184-adbe-1d64ea16c2d0" containerID="2ca770be6a947735ceec602d9229936267985ded6f56bbc476a5b324e28d2846" exitCode=0 Nov 25 17:04:59 crc kubenswrapper[4743]: I1125 17:04:59.193488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" event={"ID":"9e22db5f-a398-4184-adbe-1d64ea16c2d0","Type":"ContainerDied","Data":"2ca770be6a947735ceec602d9229936267985ded6f56bbc476a5b324e28d2846"} Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.307451 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.352277 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-d2b6v"] Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.359278 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-d2b6v"] Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.404415 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qz97\" (UniqueName: \"kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97\") pod \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.404697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host\") pod \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\" (UID: \"9e22db5f-a398-4184-adbe-1d64ea16c2d0\") " Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.404824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host" (OuterVolumeSpecName: "host") pod "9e22db5f-a398-4184-adbe-1d64ea16c2d0" (UID: "9e22db5f-a398-4184-adbe-1d64ea16c2d0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.405296 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9e22db5f-a398-4184-adbe-1d64ea16c2d0-host\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.418910 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97" (OuterVolumeSpecName: "kube-api-access-2qz97") pod "9e22db5f-a398-4184-adbe-1d64ea16c2d0" (UID: "9e22db5f-a398-4184-adbe-1d64ea16c2d0"). InnerVolumeSpecName "kube-api-access-2qz97". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:05:00 crc kubenswrapper[4743]: I1125 17:05:00.506891 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qz97\" (UniqueName: \"kubernetes.io/projected/9e22db5f-a398-4184-adbe-1d64ea16c2d0-kube-api-access-2qz97\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.218401 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f820786bef3b88a09ab91737f270a6c4d82499068e6f20481df2eb09164e401" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.219703 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-d2b6v" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.526191 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt59v/crc-debug-wnfgz"] Nov 25 17:05:01 crc kubenswrapper[4743]: E1125 17:05:01.526736 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e22db5f-a398-4184-adbe-1d64ea16c2d0" containerName="container-00" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.526750 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e22db5f-a398-4184-adbe-1d64ea16c2d0" containerName="container-00" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.526949 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e22db5f-a398-4184-adbe-1d64ea16c2d0" containerName="container-00" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.527676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.636427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmj9\" (UniqueName: \"kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.636635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.738629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmj9\" (UniqueName: 
\"kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.738690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.738863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.764167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmj9\" (UniqueName: \"kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9\") pod \"crc-debug-wnfgz\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.787782 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e22db5f-a398-4184-adbe-1d64ea16c2d0" path="/var/lib/kubelet/pods/9e22db5f-a398-4184-adbe-1d64ea16c2d0/volumes" Nov 25 17:05:01 crc kubenswrapper[4743]: I1125 17:05:01.847035 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:02 crc kubenswrapper[4743]: I1125 17:05:02.227739 4743 generic.go:334] "Generic (PLEG): container finished" podID="9196acd0-a646-4aef-a985-1bfb60713ad6" containerID="53eecf60edf7c4f20390890adc5853dd234a4bc33335c2770fdd910f9c22e62a" exitCode=0 Nov 25 17:05:02 crc kubenswrapper[4743]: I1125 17:05:02.227823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" event={"ID":"9196acd0-a646-4aef-a985-1bfb60713ad6","Type":"ContainerDied","Data":"53eecf60edf7c4f20390890adc5853dd234a4bc33335c2770fdd910f9c22e62a"} Nov 25 17:05:02 crc kubenswrapper[4743]: I1125 17:05:02.228085 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" event={"ID":"9196acd0-a646-4aef-a985-1bfb60713ad6","Type":"ContainerStarted","Data":"5a1f4c200bfd29c8e465b56d49f472f1ccf62c7986b36c9b5fd88b97805e54cf"} Nov 25 17:05:02 crc kubenswrapper[4743]: I1125 17:05:02.736658 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-wnfgz"] Nov 25 17:05:02 crc kubenswrapper[4743]: I1125 17:05:02.744618 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-wnfgz"] Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.337441 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.365889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjmj9\" (UniqueName: \"kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9\") pod \"9196acd0-a646-4aef-a985-1bfb60713ad6\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.366203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host\") pod \"9196acd0-a646-4aef-a985-1bfb60713ad6\" (UID: \"9196acd0-a646-4aef-a985-1bfb60713ad6\") " Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.366278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host" (OuterVolumeSpecName: "host") pod "9196acd0-a646-4aef-a985-1bfb60713ad6" (UID: "9196acd0-a646-4aef-a985-1bfb60713ad6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.366943 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9196acd0-a646-4aef-a985-1bfb60713ad6-host\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.371654 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9" (OuterVolumeSpecName: "kube-api-access-bjmj9") pod "9196acd0-a646-4aef-a985-1bfb60713ad6" (UID: "9196acd0-a646-4aef-a985-1bfb60713ad6"). InnerVolumeSpecName "kube-api-access-bjmj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.469274 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjmj9\" (UniqueName: \"kubernetes.io/projected/9196acd0-a646-4aef-a985-1bfb60713ad6-kube-api-access-bjmj9\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.788320 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9196acd0-a646-4aef-a985-1bfb60713ad6" path="/var/lib/kubelet/pods/9196acd0-a646-4aef-a985-1bfb60713ad6/volumes" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.949442 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jt59v/crc-debug-jfk4m"] Nov 25 17:05:03 crc kubenswrapper[4743]: E1125 17:05:03.949862 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9196acd0-a646-4aef-a985-1bfb60713ad6" containerName="container-00" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.949877 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9196acd0-a646-4aef-a985-1bfb60713ad6" containerName="container-00" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.950046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9196acd0-a646-4aef-a985-1bfb60713ad6" containerName="container-00" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.950658 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.976831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:03 crc kubenswrapper[4743]: I1125 17:05:03.977041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkv55\" (UniqueName: \"kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.078554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.078673 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.078729 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkv55\" (UniqueName: \"kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:04 crc 
kubenswrapper[4743]: I1125 17:05:04.097848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkv55\" (UniqueName: \"kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55\") pod \"crc-debug-jfk4m\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.247529 4743 scope.go:117] "RemoveContainer" containerID="53eecf60edf7c4f20390890adc5853dd234a4bc33335c2770fdd910f9c22e62a" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.247636 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-wnfgz" Nov 25 17:05:04 crc kubenswrapper[4743]: I1125 17:05:04.268083 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:05 crc kubenswrapper[4743]: I1125 17:05:05.259228 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" containerID="f24a4d113ae18eec0f430b380b92b67b33e4ed25b3efbc259494c4cc584fe8b8" exitCode=0 Nov 25 17:05:05 crc kubenswrapper[4743]: I1125 17:05:05.259309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" event={"ID":"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6","Type":"ContainerDied","Data":"f24a4d113ae18eec0f430b380b92b67b33e4ed25b3efbc259494c4cc584fe8b8"} Nov 25 17:05:05 crc kubenswrapper[4743]: I1125 17:05:05.259899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" event={"ID":"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6","Type":"ContainerStarted","Data":"b5247fad0a145bb617a54bca33d113fd759da387b6693eaa3f99726d0d18870f"} Nov 25 17:05:05 crc kubenswrapper[4743]: I1125 17:05:05.302721 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-jfk4m"] Nov 25 
17:05:05 crc kubenswrapper[4743]: I1125 17:05:05.310611 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt59v/crc-debug-jfk4m"] Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.379413 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.442534 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkv55\" (UniqueName: \"kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55\") pod \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.442654 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host\") pod \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\" (UID: \"e9f395e7-948c-4f4f-b6ad-436d77b4fcd6\") " Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.442798 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host" (OuterVolumeSpecName: "host") pod "e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" (UID: "e9f395e7-948c-4f4f-b6ad-436d77b4fcd6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.443467 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-host\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.449156 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55" (OuterVolumeSpecName: "kube-api-access-nkv55") pod "e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" (UID: "e9f395e7-948c-4f4f-b6ad-436d77b4fcd6"). InnerVolumeSpecName "kube-api-access-nkv55". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:05:06 crc kubenswrapper[4743]: I1125 17:05:06.545400 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkv55\" (UniqueName: \"kubernetes.io/projected/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6-kube-api-access-nkv55\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:07 crc kubenswrapper[4743]: I1125 17:05:07.281319 4743 scope.go:117] "RemoveContainer" containerID="f24a4d113ae18eec0f430b380b92b67b33e4ed25b3efbc259494c4cc584fe8b8" Nov 25 17:05:07 crc kubenswrapper[4743]: I1125 17:05:07.281353 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/crc-debug-jfk4m" Nov 25 17:05:07 crc kubenswrapper[4743]: I1125 17:05:07.785911 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" path="/var/lib/kubelet/pods/e9f395e7-948c-4f4f-b6ad-436d77b4fcd6/volumes" Nov 25 17:05:24 crc kubenswrapper[4743]: I1125 17:05:24.596549 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85dc5d687d-qkdzh_f3984c1f-c5d2-4a6a-9058-4c272455dcd8/barbican-api/0.log" Nov 25 17:05:24 crc kubenswrapper[4743]: I1125 17:05:24.766838 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74944f4b54-xg775_095f59f0-0093-4e6d-8aa3-0ddc0161b213/barbican-keystone-listener/0.log" Nov 25 17:05:24 crc kubenswrapper[4743]: I1125 17:05:24.790904 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-85dc5d687d-qkdzh_f3984c1f-c5d2-4a6a-9058-4c272455dcd8/barbican-api-log/0.log" Nov 25 17:05:24 crc kubenswrapper[4743]: I1125 17:05:24.861191 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74944f4b54-xg775_095f59f0-0093-4e6d-8aa3-0ddc0161b213/barbican-keystone-listener-log/0.log" Nov 25 17:05:24 crc kubenswrapper[4743]: I1125 17:05:24.972448 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c9cfd9b5-p7l4w_ef127ba1-444d-4f1c-937b-965c7ce47d1a/barbican-worker/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.019859 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c9cfd9b5-p7l4w_ef127ba1-444d-4f1c-937b-965c7ce47d1a/barbican-worker-log/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.191052 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cppqd_14bc3c31-f23e-4c67-a989-e85613bd5607/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.253810 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/ceilometer-central-agent/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.335017 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/ceilometer-notification-agent/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.387157 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/proxy-httpd/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.437502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_95304982-4885-4344-914e-1a4693b5eed1/sg-core/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.586463 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f8e01616-0594-420e-9180-2c348780903a/cinder-api/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.594097 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f8e01616-0594-420e-9180-2c348780903a/cinder-api-log/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.741607 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fd119f0-4e29-4050-baee-a0261c883787/cinder-scheduler/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.786073 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fd119f0-4e29-4050-baee-a0261c883787/probe/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.858120 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pzm62_cf95749b-9f3b-4df2-afaf-869ec45e1807/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:25 crc kubenswrapper[4743]: I1125 17:05:25.979773 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m9pmt_78da403f-2a93-4d95-b6b6-e3f0d4cbc6d0/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.060131 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/init/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.213890 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/init/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.251829 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mqpzt_370e248d-8977-4a95-ac29-df64918b694b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.280837 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-96jwb_a587d785-9e96-41ef-95b8-a247f530e971/dnsmasq-dns/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.614821 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_791e2d3a-4b72-42dc-9df0-0a185817f347/glance-httpd/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.665313 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_791e2d3a-4b72-42dc-9df0-0a185817f347/glance-log/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.830815 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_07d575a4-6889-4bf6-ad82-4c7e756607d2/glance-httpd/0.log" Nov 25 17:05:26 crc kubenswrapper[4743]: I1125 17:05:26.866930 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_07d575a4-6889-4bf6-ad82-4c7e756607d2/glance-log/0.log" Nov 25 17:05:27 crc kubenswrapper[4743]: I1125 17:05:27.002556 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7495cddcb-ghpkx_1e54ceb1-969a-4172-9928-7e424dd38b5b/horizon/0.log" Nov 25 17:05:27 crc kubenswrapper[4743]: I1125 17:05:27.457452 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7495cddcb-ghpkx_1e54ceb1-969a-4172-9928-7e424dd38b5b/horizon-log/0.log" Nov 25 17:05:27 crc kubenswrapper[4743]: I1125 17:05:27.679552 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qtx8v_b1c2dd10-3126-4c40-a55f-679ed3441056/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:27 crc kubenswrapper[4743]: I1125 17:05:27.735748 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kf689_522738de-cb3a-424d-ae01-b73bd3bcd8c6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:27 crc kubenswrapper[4743]: I1125 17:05:27.943229 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29401501-bfhng_e84e92f3-1680-4cac-b59e-1b783e572d24/keystone-cron/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.048160 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-848747fd7b-bljn8_0e5a8995-2691-4c7f-baee-bf9cdf1b2427/keystone-api/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.083863 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_aa4a8c5c-3c11-45e5-815e-bebe62e1b165/kube-state-metrics/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.222165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xnnv6_7568caf6-7fa3-429a-90f2-40cbd4dece9d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.488416 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cc5fc48dc-hkvc8_c8823220-9bb8-44a4-a4a6-00661d8e2fad/neutron-httpd/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.543261 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cc5fc48dc-hkvc8_c8823220-9bb8-44a4-a4a6-00661d8e2fad/neutron-api/0.log" Nov 25 17:05:28 crc kubenswrapper[4743]: I1125 17:05:28.685076 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-pzdxv_389b43ba-821f-48b6-b924-46ddda4e2d11/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.065745 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a5aaab81-18e9-41e2-8db4-00c4a09b7710/nova-api-log/0.log" Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.166023 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_da59725d-9914-40d1-b70b-57df96de1db2/nova-cell0-conductor-conductor/0.log" Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.411879 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a5aaab81-18e9-41e2-8db4-00c4a09b7710/nova-api-api/0.log" Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.816703 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_66fd62a5-dbf6-4ff3-a910-1969f287da86/nova-cell1-conductor-conductor/0.log" 
Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.879842 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d229e467-a473-44bf-9f13-73155f796874/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 17:05:29 crc kubenswrapper[4743]: I1125 17:05:29.929375 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-wfdh2_a80ee7c3-2c23-4079-994f-b04e8a21516e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:30 crc kubenswrapper[4743]: I1125 17:05:30.289333 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5b78218b-03ac-4dbb-89cf-58580f5367d3/nova-metadata-log/0.log" Nov 25 17:05:30 crc kubenswrapper[4743]: I1125 17:05:30.659197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5f82a83d-3847-490f-b9dd-5dda26140b80/nova-scheduler-scheduler/0.log" Nov 25 17:05:30 crc kubenswrapper[4743]: I1125 17:05:30.660733 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/mysql-bootstrap/0.log" Nov 25 17:05:30 crc kubenswrapper[4743]: I1125 17:05:30.809358 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/mysql-bootstrap/0.log" Nov 25 17:05:30 crc kubenswrapper[4743]: I1125 17:05:30.876835 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e2aaa698-cbf5-42d7-bc7f-a3ae0e5ba86d/galera/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.041198 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/mysql-bootstrap/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.266155 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/mysql-bootstrap/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.279423 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e54e0104-81dc-49fc-9233-135bf00032be/galera/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.457026 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b65064de-e088-4c89-9767-db14019b6e44/openstackclient/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.493789 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8dtsl_7750901a-7566-4d94-8cb5-5aff66e22116/ovn-controller/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.677523 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-znflv_5835f976-c6b4-4bd9-9893-70905ce30872/openstack-network-exporter/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.700915 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5b78218b-03ac-4dbb-89cf-58580f5367d3/nova-metadata-metadata/0.log" Nov 25 17:05:31 crc kubenswrapper[4743]: I1125 17:05:31.889161 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server-init/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.059249 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.081904 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovsdb-server-init/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.134835 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-lmnwx_ddf6aadd-2938-4c6e-ad9d-aa4459f6ebbc/ovs-vswitchd/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.353063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-75jlm_2073fba4-3e3f-4c49-ae69-265ffbc47f68/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.361683 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eb743ab5-16ea-4be4-95ee-00a87767602e/openstack-network-exporter/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.435147 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eb743ab5-16ea-4be4-95ee-00a87767602e/ovn-northd/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.601088 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1/openstack-network-exporter/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.648059 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3ea5ee8a-d85d-40b3-ad4c-b89f29a7fdf1/ovsdbserver-nb/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.804418 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1c6f500e-afe2-4505-8a75-d68f109b80dc/openstack-network-exporter/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.806665 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1c6f500e-afe2-4505-8a75-d68f109b80dc/ovsdbserver-sb/0.log" Nov 25 17:05:32 crc kubenswrapper[4743]: I1125 17:05:32.949851 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6dd8557654-lgr92_659f8a21-e29e-47be-903b-742de8ec9b22/placement-api/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.163281 4743 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/setup-container/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.172971 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6dd8557654-lgr92_659f8a21-e29e-47be-903b-742de8ec9b22/placement-log/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.328055 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/setup-container/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.376785 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/setup-container/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.389402 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f54afd9a-9279-4fd3-a14a-6742d1ad9d96/rabbitmq/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.818860 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/setup-container/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.892803 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6nngt_0d82431d-8bd6-4d1a-850d-d8c543994421/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:33 crc kubenswrapper[4743]: I1125 17:05:33.923016 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1337639a-d66d-43cb-a7d9-487f22d1d804/rabbitmq/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.028794 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fdkdh_36ce5802-7073-425c-bd4e-1b770cfacd49/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:34 crc 
kubenswrapper[4743]: I1125 17:05:34.139130 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-whzh6_4ae17f2e-689f-4dd3-bc91-c52a218a8492/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.305646 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gqf46_a2aeec84-22a9-4f07-a1e2-12e61f62f09c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.420684 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vd9jg_3738fd51-cc5f-4837-8a29-3dc3d3bbcfd5/ssh-known-hosts-edpm-deployment/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.644531 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c64568bc5-svsgq_fd562da8-2d36-4517-8d73-237580575e98/proxy-server/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.715360 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c64568bc5-svsgq_fd562da8-2d36-4517-8d73-237580575e98/proxy-httpd/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.782089 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m926g_f5eff179-2afc-4ec2-addc-31c3c36a6fd7/swift-ring-rebalance/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.894488 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-auditor/0.log" Nov 25 17:05:34 crc kubenswrapper[4743]: I1125 17:05:34.972949 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-reaper/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.025691 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-replicator/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.123399 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-auditor/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.141260 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/account-server/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.178131 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-replicator/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.267020 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-server/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.365555 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/container-updater/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.379210 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-auditor/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.408016 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-expirer/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.513328 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-replicator/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.598882 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-server/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.600173 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/object-updater/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.617964 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/rsync/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.780197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9ae66928-3c05-4597-98a6-f663e9df7cff/swift-recon-cron/0.log" Nov 25 17:05:35 crc kubenswrapper[4743]: I1125 17:05:35.828260 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-ks9ql_4dd9e80f-8e99-46a6-b669-b2ec10285463/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:36 crc kubenswrapper[4743]: I1125 17:05:36.000353 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_47459f25-57d0-4c84-8f42-81c8698769bd/tempest-tests-tempest-tests-runner/0.log" Nov 25 17:05:36 crc kubenswrapper[4743]: I1125 17:05:36.004708 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4e216387-c508-4f98-adff-3b4a3e97003e/test-operator-logs-container/0.log" Nov 25 17:05:36 crc kubenswrapper[4743]: I1125 17:05:36.202375 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x4rkr_865996cb-146d-428e-aff6-7ce31c808ffe/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.092855 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:38 crc 
kubenswrapper[4743]: E1125 17:05:38.093630 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" containerName="container-00" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.093641 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" containerName="container-00" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.093824 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f395e7-948c-4f4f-b6ad-436d77b4fcd6" containerName="container-00" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.095196 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.105919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.294025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.294117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.294176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2ckn\" (UniqueName: 
\"kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.395386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.395477 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.395529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2ckn\" (UniqueName: \"kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.395999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.396023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.418349 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2ckn\" (UniqueName: \"kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn\") pod \"certified-operators-dznng\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.422578 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:38 crc kubenswrapper[4743]: I1125 17:05:38.943252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:39 crc kubenswrapper[4743]: I1125 17:05:39.555036 4743 generic.go:334] "Generic (PLEG): container finished" podID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerID="ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e" exitCode=0 Nov 25 17:05:39 crc kubenswrapper[4743]: I1125 17:05:39.555089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerDied","Data":"ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e"} Nov 25 17:05:39 crc kubenswrapper[4743]: I1125 17:05:39.555627 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerStarted","Data":"52f7f6321b83f72f1ad648fbd7e258b1f4ea2f4b325a290248c5e7115d7615c9"} Nov 25 17:05:39 crc kubenswrapper[4743]: I1125 17:05:39.557285 4743 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Nov 25 17:05:40 crc kubenswrapper[4743]: I1125 17:05:40.571842 4743 generic.go:334] "Generic (PLEG): container finished" podID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerID="145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07" exitCode=0 Nov 25 17:05:40 crc kubenswrapper[4743]: I1125 17:05:40.571891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerDied","Data":"145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07"} Nov 25 17:05:41 crc kubenswrapper[4743]: I1125 17:05:41.588527 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerStarted","Data":"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721"} Nov 25 17:05:41 crc kubenswrapper[4743]: I1125 17:05:41.611977 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dznng" podStartSLOduration=1.918321638 podStartE2EDuration="3.611914198s" podCreationTimestamp="2025-11-25 17:05:38 +0000 UTC" firstStartedPulling="2025-11-25 17:05:39.557081897 +0000 UTC m=+4018.678921446" lastFinishedPulling="2025-11-25 17:05:41.250674457 +0000 UTC m=+4020.372514006" observedRunningTime="2025-11-25 17:05:41.611846316 +0000 UTC m=+4020.733685865" watchObservedRunningTime="2025-11-25 17:05:41.611914198 +0000 UTC m=+4020.733753747" Nov 25 17:05:48 crc kubenswrapper[4743]: I1125 17:05:48.423310 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:48 crc kubenswrapper[4743]: I1125 17:05:48.424186 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:48 crc 
kubenswrapper[4743]: I1125 17:05:48.954022 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:48 crc kubenswrapper[4743]: I1125 17:05:48.970301 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_30545138-1305-45e8-9225-386065312213/memcached/0.log" Nov 25 17:05:49 crc kubenswrapper[4743]: I1125 17:05:49.004485 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:49 crc kubenswrapper[4743]: I1125 17:05:49.192573 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:50 crc kubenswrapper[4743]: I1125 17:05:50.673117 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dznng" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="registry-server" containerID="cri-o://105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721" gracePeriod=2 Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.307830 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.424629 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content\") pod \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.424715 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2ckn\" (UniqueName: \"kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn\") pod \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.424874 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities\") pod \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\" (UID: \"94445ed5-4922-4db4-87a6-0290fc0ce4d6\") " Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.425717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities" (OuterVolumeSpecName: "utilities") pod "94445ed5-4922-4db4-87a6-0290fc0ce4d6" (UID: "94445ed5-4922-4db4-87a6-0290fc0ce4d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.436753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn" (OuterVolumeSpecName: "kube-api-access-h2ckn") pod "94445ed5-4922-4db4-87a6-0290fc0ce4d6" (UID: "94445ed5-4922-4db4-87a6-0290fc0ce4d6"). InnerVolumeSpecName "kube-api-access-h2ckn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.484237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94445ed5-4922-4db4-87a6-0290fc0ce4d6" (UID: "94445ed5-4922-4db4-87a6-0290fc0ce4d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.526895 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.526932 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94445ed5-4922-4db4-87a6-0290fc0ce4d6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.526942 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2ckn\" (UniqueName: \"kubernetes.io/projected/94445ed5-4922-4db4-87a6-0290fc0ce4d6-kube-api-access-h2ckn\") on node \"crc\" DevicePath \"\"" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.684483 4743 generic.go:334] "Generic (PLEG): container finished" podID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerID="105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721" exitCode=0 Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.684533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerDied","Data":"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721"} Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.684562 4743 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dznng" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.684610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dznng" event={"ID":"94445ed5-4922-4db4-87a6-0290fc0ce4d6","Type":"ContainerDied","Data":"52f7f6321b83f72f1ad648fbd7e258b1f4ea2f4b325a290248c5e7115d7615c9"} Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.684635 4743 scope.go:117] "RemoveContainer" containerID="105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.707718 4743 scope.go:117] "RemoveContainer" containerID="145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.719123 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.728257 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dznng"] Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.750256 4743 scope.go:117] "RemoveContainer" containerID="ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.782905 4743 scope.go:117] "RemoveContainer" containerID="105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721" Nov 25 17:05:51 crc kubenswrapper[4743]: E1125 17:05:51.784939 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721\": container with ID starting with 105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721 not found: ID does not exist" containerID="105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.785058 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721"} err="failed to get container status \"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721\": rpc error: code = NotFound desc = could not find container \"105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721\": container with ID starting with 105a6bd5ddabea2e40042d32506da3b1c98def463f98a692051f06ae0c355721 not found: ID does not exist" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.785136 4743 scope.go:117] "RemoveContainer" containerID="145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07" Nov 25 17:05:51 crc kubenswrapper[4743]: E1125 17:05:51.786013 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07\": container with ID starting with 145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07 not found: ID does not exist" containerID="145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.786121 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07"} err="failed to get container status \"145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07\": rpc error: code = NotFound desc = could not find container \"145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07\": container with ID starting with 145d1f60cc0d5e6154494179fb017e5cc69e236e32237af6b9d33855f9897f07 not found: ID does not exist" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.786187 4743 scope.go:117] "RemoveContainer" containerID="ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e" Nov 25 17:05:51 crc kubenswrapper[4743]: E1125 
17:05:51.786520 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e\": container with ID starting with ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e not found: ID does not exist" containerID="ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.786734 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e"} err="failed to get container status \"ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e\": rpc error: code = NotFound desc = could not find container \"ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e\": container with ID starting with ad2c70b0d80b7060857d20884f73086e43e9274fd19ddb88375c32756b33fc4e not found: ID does not exist" Nov 25 17:05:51 crc kubenswrapper[4743]: I1125 17:05:51.798991 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" path="/var/lib/kubelet/pods/94445ed5-4922-4db4-87a6-0290fc0ce4d6/volumes" Nov 25 17:06:02 crc kubenswrapper[4743]: I1125 17:06:02.747357 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log" Nov 25 17:06:02 crc kubenswrapper[4743]: I1125 17:06:02.929068 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log" Nov 25 17:06:02 crc kubenswrapper[4743]: I1125 17:06:02.934065 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log" Nov 25 17:06:02 crc kubenswrapper[4743]: I1125 17:06:02.969340 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.121695 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/util/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.127092 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/pull/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.169395 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_02b51373bfd38b97438fe30046ee5f996481346af7e5a5158c07c13aa8st2rd_ab54bea3-befb-4d86-a499-806f480df7b0/extract/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.280135 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-82tqm_9b77ce44-3830-488e-ac40-97af4d969f6e/kube-rbac-proxy/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.363490 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wk72z_0729dc1e-3e2c-410e-892d-ef4773882665/kube-rbac-proxy/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.419308 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-82tqm_9b77ce44-3830-488e-ac40-97af4d969f6e/manager/0.log" Nov 25 17:06:03 crc 
kubenswrapper[4743]: I1125 17:06:03.478557 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-wk72z_0729dc1e-3e2c-410e-892d-ef4773882665/manager/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.566334 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-fwlrd_6a470e3c-9cac-463b-a253-308f3c386725/kube-rbac-proxy/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.632322 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-fwlrd_6a470e3c-9cac-463b-a253-308f3c386725/manager/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.707324 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-q8p2b_288e97c2-c236-4177-9a52-bcf1c6c69faa/kube-rbac-proxy/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.815018 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-q8p2b_288e97c2-c236-4177-9a52-bcf1c6c69faa/manager/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.867241 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-prmnw_29690625-5e1d-417a-b0e5-9d74645b31f7/kube-rbac-proxy/0.log" Nov 25 17:06:03 crc kubenswrapper[4743]: I1125 17:06:03.933793 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-prmnw_29690625-5e1d-417a-b0e5-9d74645b31f7/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.033045 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-zq7mp_aebddcf8-77ce-4317-94c3-f29b45f93686/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.039085 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-zq7mp_aebddcf8-77ce-4317-94c3-f29b45f93686/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.158955 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-zdjj7_d0cd465a-f903-48ef-aca1-839a390d3f12/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.338694 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-zdjj7_d0cd465a-f903-48ef-aca1-839a390d3f12/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.351149 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f7g4h_38527a3c-d051-4354-a8ca-0692153762f1/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.381014 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-f7g4h_38527a3c-d051-4354-a8ca-0692153762f1/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.553685 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-z5rhv_bc109d32-7111-40b6-aff6-7596c933114f/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.593245 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-z5rhv_bc109d32-7111-40b6-aff6-7596c933114f/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.657447 
4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-8s47h_9caca3f1-e43f-47ab-aa8a-1248a30cfda4/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.742099 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-8s47h_9caca3f1-e43f-47ab-aa8a-1248a30cfda4/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.777489 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-djxpp_9f422105-6959-44e5-93e2-901fd9b84dfc/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.857542 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-djxpp_9f422105-6959-44e5-93e2-901fd9b84dfc/manager/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.955004 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-j9zq7_8d418847-cf8f-4977-bc14-3d4b64591e68/kube-rbac-proxy/0.log" Nov 25 17:06:04 crc kubenswrapper[4743]: I1125 17:06:04.990394 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-j9zq7_8d418847-cf8f-4977-bc14-3d4b64591e68/manager/0.log" Nov 25 17:06:05 crc kubenswrapper[4743]: I1125 17:06:05.126677 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-bxdjg_88757539-b3d4-4de5-bc96-a4cd13d5a203/kube-rbac-proxy/0.log" Nov 25 17:06:05 crc kubenswrapper[4743]: I1125 17:06:05.253125 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-bxdjg_88757539-b3d4-4de5-bc96-a4cd13d5a203/manager/0.log" Nov 25 17:06:05 crc 
kubenswrapper[4743]: I1125 17:06:05.307028 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-bvwmw_44e2f27d-a5d4-48cf-90f5-2f5598a2295a/manager/0.log" Nov 25 17:06:05 crc kubenswrapper[4743]: I1125 17:06:05.330356 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-bvwmw_44e2f27d-a5d4-48cf-90f5-2f5598a2295a/kube-rbac-proxy/0.log" Nov 25 17:06:05 crc kubenswrapper[4743]: I1125 17:06:05.451752 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8_94418bc2-d439-451f-91c2-c457a200825e/kube-rbac-proxy/0.log" Nov 25 17:06:05 crc kubenswrapper[4743]: I1125 17:06:05.512653 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-544b9bb9-qsrn8_94418bc2-d439-451f-91c2-c457a200825e/manager/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.070205 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9gt7m_7413a348-450f-4717-a52f-595041381991/registry-server/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.115436 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-59fdcdbdd4-vrqql_e289302a-7d4a-4b30-94fe-5babb338505d/operator/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.300405 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-mwm9p_e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf/kube-rbac-proxy/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.406646 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-mwm9p_e0a0f65a-b18b-479e-8ef8-5f7c6c36ccdf/manager/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.547105 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-g5bp8_7be9f6fc-3582-4e14-a452-daa24035d10e/kube-rbac-proxy/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.565477 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-g5bp8_7be9f6fc-3582-4e14-a452-daa24035d10e/manager/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.677000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-g4sq7_66ead12f-65d1-4438-80b0-1a747105d7fc/operator/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.804848 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-smsbr_e2417720-74c0-4232-9f99-cdc10e485c91/kube-rbac-proxy/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.927107 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-smsbr_e2417720-74c0-4232-9f99-cdc10e485c91/manager/0.log" Nov 25 17:06:06 crc kubenswrapper[4743]: I1125 17:06:06.953038 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-746c9d5b4f-z2hm7_20c829b2-be6f-4f96-85c1-21279d871c99/manager/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.004203 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-8lsj8_d4e33a37-ac1e-408c-b0d3-a1352daa67af/kube-rbac-proxy/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.110910 4743 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-xbxjh_05a605e3-814f-45a4-8461-47cbb3330652/kube-rbac-proxy/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.157207 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-8lsj8_d4e33a37-ac1e-408c-b0d3-a1352daa67af/manager/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.170847 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-xbxjh_05a605e3-814f-45a4-8461-47cbb3330652/manager/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.304891 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-pjmtx_5a86bde8-f04d-4bfa-842f-6c960d7232fb/kube-rbac-proxy/0.log" Nov 25 17:06:07 crc kubenswrapper[4743]: I1125 17:06:07.306010 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-pjmtx_5a86bde8-f04d-4bfa-842f-6c960d7232fb/manager/0.log" Nov 25 17:06:23 crc kubenswrapper[4743]: I1125 17:06:23.853197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8vr4f_fb60121d-df03-4f88-a9e5-118105c6ce94/control-plane-machine-set-operator/0.log" Nov 25 17:06:23 crc kubenswrapper[4743]: I1125 17:06:23.982636 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j9hhf_a655856c-3900-4342-a094-dc03b84c8876/kube-rbac-proxy/0.log" Nov 25 17:06:24 crc kubenswrapper[4743]: I1125 17:06:24.020131 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j9hhf_a655856c-3900-4342-a094-dc03b84c8876/machine-api-operator/0.log" Nov 25 17:06:36 crc 
kubenswrapper[4743]: I1125 17:06:36.944381 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cn9d5_779c1d2b-063b-413b-80c1-63c1b5438aff/cert-manager-controller/0.log" Nov 25 17:06:37 crc kubenswrapper[4743]: I1125 17:06:37.029239 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-s8482_6afbc225-6b21-4fe7-80a6-9fe85ffcac89/cert-manager-cainjector/0.log" Nov 25 17:06:37 crc kubenswrapper[4743]: I1125 17:06:37.145331 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-clc4q_9e38fcf4-5a14-44c4-b8ad-970d07e82284/cert-manager-webhook/0.log" Nov 25 17:06:50 crc kubenswrapper[4743]: I1125 17:06:50.077122 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:06:50 crc kubenswrapper[4743]: I1125 17:06:50.077686 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:06:50 crc kubenswrapper[4743]: I1125 17:06:50.825509 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-zhqdb_994ed247-8a08-4164-89b4-c03a90c4ef5d/nmstate-console-plugin/0.log" Nov 25 17:06:51 crc kubenswrapper[4743]: I1125 17:06:51.033642 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6cznj_25afedc8-ae76-4c86-aeaa-c739b1458040/nmstate-handler/0.log" Nov 25 17:06:51 crc kubenswrapper[4743]: I1125 17:06:51.035569 
4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-7q45f_62b25135-7567-4053-ab8a-5df129154693/kube-rbac-proxy/0.log" Nov 25 17:06:51 crc kubenswrapper[4743]: I1125 17:06:51.080545 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-7q45f_62b25135-7567-4053-ab8a-5df129154693/nmstate-metrics/0.log" Nov 25 17:06:51 crc kubenswrapper[4743]: I1125 17:06:51.262493 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-6mtvn_a3c55fbc-7f41-4fe1-b7cf-9b5476c4c1ae/nmstate-operator/0.log" Nov 25 17:06:51 crc kubenswrapper[4743]: I1125 17:06:51.305044 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-clgq2_b423b0b1-b7c2-4a09-a332-cc9c03bfca51/nmstate-webhook/0.log" Nov 25 17:07:05 crc kubenswrapper[4743]: I1125 17:07:05.741296 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-685f4_d805cc14-bb31-4762-9079-dedb5e33e391/kube-rbac-proxy/0.log" Nov 25 17:07:05 crc kubenswrapper[4743]: I1125 17:07:05.876297 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-685f4_d805cc14-bb31-4762-9079-dedb5e33e391/controller/0.log" Nov 25 17:07:05 crc kubenswrapper[4743]: I1125 17:07:05.941744 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.134907 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.140522 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 17:07:06 
crc kubenswrapper[4743]: I1125 17:07:06.155828 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.171319 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.342325 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.369508 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.412429 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.426758 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.567836 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-reloader/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.585818 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-frr-files/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.614738 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/controller/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.627706 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/cp-metrics/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.726616 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/frr-metrics/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.779415 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/kube-rbac-proxy/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.823302 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/kube-rbac-proxy-frr/0.log" Nov 25 17:07:06 crc kubenswrapper[4743]: I1125 17:07:06.908349 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/reloader/0.log" Nov 25 17:07:07 crc kubenswrapper[4743]: I1125 17:07:07.000689 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-q2tm6_77af4b9b-ed3c-4b08-ab42-bd5b3c69cdec/frr-k8s-webhook-server/0.log" Nov 25 17:07:07 crc kubenswrapper[4743]: I1125 17:07:07.178621 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b6fb67cbb-wpj49_03ed5f22-b285-4560-8572-798606c90e7b/manager/0.log" Nov 25 17:07:07 crc kubenswrapper[4743]: I1125 17:07:07.335989 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-84c6f5f694-f9nf8_bed7e486-cad7-437c-8196-4fc08dd20eb6/webhook-server/0.log" Nov 25 17:07:07 crc kubenswrapper[4743]: I1125 17:07:07.495262 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8f677_fd919f22-093b-4ba9-bbc1-06a5360f6f32/kube-rbac-proxy/0.log" Nov 25 17:07:08 crc kubenswrapper[4743]: I1125 17:07:08.045969 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8f677_fd919f22-093b-4ba9-bbc1-06a5360f6f32/speaker/0.log" Nov 25 17:07:08 crc kubenswrapper[4743]: I1125 17:07:08.151828 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nv4nd_ddf4605c-5031-4d69-9b22-a49126d26f66/frr/0.log" Nov 25 17:07:19 crc kubenswrapper[4743]: I1125 17:07:19.941184 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.062910 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.072794 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.077469 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.077518 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.087077 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.242364 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/extract/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.266717 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/pull/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.273915 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772eg4wbt_8f0d30a7-3fb8-4595-a303-9490f5a78667/util/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.407716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.539902 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.540741 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.607447 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.722824 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-utilities/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.742512 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/extract-content/0.log" Nov 25 17:07:20 crc kubenswrapper[4743]: I1125 17:07:20.938581 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.160171 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.168654 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.244453 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.286142 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9vz76_48cfc5c0-943f-4d89-80f1-bc08e3c3a589/registry-server/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.369666 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-utilities/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.378157 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/extract-content/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.527284 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.745125 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.759048 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.850327 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 17:07:21 crc kubenswrapper[4743]: I1125 17:07:21.962601 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/util/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.014124 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/pull/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.074806 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lnc8j_25b1c299-a6a1-4afe-a1f8-9c04410a01a0/extract/0.log" Nov 25 17:07:22 crc 
kubenswrapper[4743]: I1125 17:07:22.217854 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hsk2l_5bf56610-e316-490a-b030-094e92f0f76d/registry-server/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.291744 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-r4ckx_84c01433-ed4b-4b70-8473-7905b701f657/marketplace-operator/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.378980 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.532161 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.533777 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.544150 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.706340 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-utilities/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.713722 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/extract-content/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.843506 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kbcvk_033a7590-8333-4c20-8f6a-71c2f7410c3f/registry-server/0.log" Nov 25 17:07:22 crc kubenswrapper[4743]: I1125 17:07:22.921629 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 17:07:23 crc kubenswrapper[4743]: I1125 17:07:23.018213 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 17:07:23 crc kubenswrapper[4743]: I1125 17:07:23.887530 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 17:07:23 crc kubenswrapper[4743]: I1125 17:07:23.914197 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 17:07:24 crc kubenswrapper[4743]: I1125 17:07:24.097811 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-content/0.log" Nov 25 17:07:24 crc kubenswrapper[4743]: I1125 17:07:24.099992 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/extract-utilities/0.log" Nov 25 17:07:24 crc kubenswrapper[4743]: I1125 17:07:24.678623 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gmt44_f01f0e90-72f1-4251-b010-4f32a5ba0741/registry-server/0.log" Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.076844 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.077348 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.077392 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.078090 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.078146 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5" gracePeriod=600 Nov 25 17:07:50 crc kubenswrapper[4743]: E1125 17:07:50.242735 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c29847_f70f_4ab1_9691_685966384446.slice/crio-df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5.scope\": RecentStats: unable to find data in memory cache]" Nov 25 17:07:50 crc kubenswrapper[4743]: 
I1125 17:07:50.750705 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5" exitCode=0 Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.750785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5"} Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.751372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerStarted","Data":"2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3"} Nov 25 17:07:50 crc kubenswrapper[4743]: I1125 17:07:50.751400 4743 scope.go:117] "RemoveContainer" containerID="919d9e240db2ca8d85c907bfb1dc36268249ea174e558ce91092f2437d744d76" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.052603 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:42 crc kubenswrapper[4743]: E1125 17:08:42.053517 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="extract-content" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.053543 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="extract-content" Nov 25 17:08:42 crc kubenswrapper[4743]: E1125 17:08:42.053561 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="registry-server" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.053567 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" 
containerName="registry-server" Nov 25 17:08:42 crc kubenswrapper[4743]: E1125 17:08:42.053612 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="extract-utilities" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.053619 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="extract-utilities" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.053806 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94445ed5-4922-4db4-87a6-0290fc0ce4d6" containerName="registry-server" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.061531 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.063055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.063188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvk2\" (UniqueName: \"kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.063454 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities\") pod \"redhat-marketplace-hlxmq\" (UID: 
\"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.069150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.165176 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.165361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.165444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvk2\" (UniqueName: \"kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.166982 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.167019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.185432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvk2\" (UniqueName: \"kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2\") pod \"redhat-marketplace-hlxmq\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.392775 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:42 crc kubenswrapper[4743]: I1125 17:08:42.826631 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:42 crc kubenswrapper[4743]: W1125 17:08:42.838306 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b8c218_4dd6_4c3a_85ff_595df88a1bfb.slice/crio-f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f WatchSource:0}: Error finding container f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f: Status 404 returned error can't find the container with id f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f Nov 25 17:08:43 crc kubenswrapper[4743]: I1125 17:08:43.270925 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerID="efb868d06d7a66f11c92e285b6ed7baa23aae7b73bea9e0cd7a633701bc839e8" exitCode=0 Nov 25 17:08:43 crc kubenswrapper[4743]: I1125 17:08:43.272000 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" 
event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerDied","Data":"efb868d06d7a66f11c92e285b6ed7baa23aae7b73bea9e0cd7a633701bc839e8"} Nov 25 17:08:43 crc kubenswrapper[4743]: I1125 17:08:43.272048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerStarted","Data":"f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f"} Nov 25 17:08:45 crc kubenswrapper[4743]: I1125 17:08:45.292460 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerID="b8e53692a78a42890b2e5aa25eb73e5c38ce180832d2a1fca75cc72fb009bcc3" exitCode=0 Nov 25 17:08:45 crc kubenswrapper[4743]: I1125 17:08:45.292519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerDied","Data":"b8e53692a78a42890b2e5aa25eb73e5c38ce180832d2a1fca75cc72fb009bcc3"} Nov 25 17:08:46 crc kubenswrapper[4743]: I1125 17:08:46.302815 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerStarted","Data":"d3d0f30e06690e6d2f95c4ec6334f8ed7d5579cf9bcfe2b0428697af0a9d9114"} Nov 25 17:08:46 crc kubenswrapper[4743]: I1125 17:08:46.326922 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hlxmq" podStartSLOduration=1.894810653 podStartE2EDuration="4.326898067s" podCreationTimestamp="2025-11-25 17:08:42 +0000 UTC" firstStartedPulling="2025-11-25 17:08:43.273666972 +0000 UTC m=+4202.395506521" lastFinishedPulling="2025-11-25 17:08:45.705754386 +0000 UTC m=+4204.827593935" observedRunningTime="2025-11-25 17:08:46.319264387 +0000 UTC m=+4205.441103956" watchObservedRunningTime="2025-11-25 17:08:46.326898067 +0000 UTC 
m=+4205.448737616" Nov 25 17:08:52 crc kubenswrapper[4743]: I1125 17:08:52.393373 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:52 crc kubenswrapper[4743]: I1125 17:08:52.394234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:52 crc kubenswrapper[4743]: I1125 17:08:52.453607 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:53 crc kubenswrapper[4743]: I1125 17:08:53.458561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:53 crc kubenswrapper[4743]: I1125 17:08:53.512638 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:55 crc kubenswrapper[4743]: I1125 17:08:55.394625 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hlxmq" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="registry-server" containerID="cri-o://d3d0f30e06690e6d2f95c4ec6334f8ed7d5579cf9bcfe2b0428697af0a9d9114" gracePeriod=2 Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.407906 4743 generic.go:334] "Generic (PLEG): container finished" podID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerID="d3d0f30e06690e6d2f95c4ec6334f8ed7d5579cf9bcfe2b0428697af0a9d9114" exitCode=0 Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.408198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerDied","Data":"d3d0f30e06690e6d2f95c4ec6334f8ed7d5579cf9bcfe2b0428697af0a9d9114"} Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.408229 4743 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-hlxmq" event={"ID":"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb","Type":"ContainerDied","Data":"f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f"} Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.408244 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f3c716901467441e3d082485aa952c876562311622dcd07f11c4bae3b3a41f" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.489302 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.644043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content\") pod \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.644252 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzvk2\" (UniqueName: \"kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2\") pod \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.644570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities\") pod \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\" (UID: \"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb\") " Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.645387 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities" (OuterVolumeSpecName: "utilities") pod "a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" (UID: 
"a9b8c218-4dd6-4c3a-85ff-595df88a1bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.653922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2" (OuterVolumeSpecName: "kube-api-access-kzvk2") pod "a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" (UID: "a9b8c218-4dd6-4c3a-85ff-595df88a1bfb"). InnerVolumeSpecName "kube-api-access-kzvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.692015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" (UID: "a9b8c218-4dd6-4c3a-85ff-595df88a1bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.748423 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzvk2\" (UniqueName: \"kubernetes.io/projected/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-kube-api-access-kzvk2\") on node \"crc\" DevicePath \"\"" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.748462 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 17:08:56 crc kubenswrapper[4743]: I1125 17:08:56.748472 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 17:08:57 crc kubenswrapper[4743]: I1125 17:08:57.416449 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hlxmq" Nov 25 17:08:57 crc kubenswrapper[4743]: I1125 17:08:57.458100 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:57 crc kubenswrapper[4743]: I1125 17:08:57.475051 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hlxmq"] Nov 25 17:08:57 crc kubenswrapper[4743]: I1125 17:08:57.788355 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" path="/var/lib/kubelet/pods/a9b8c218-4dd6-4c3a-85ff-595df88a1bfb/volumes" Nov 25 17:09:01 crc kubenswrapper[4743]: I1125 17:09:01.455585 4743 generic.go:334] "Generic (PLEG): container finished" podID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerID="bb131daf59033c4f22a31c934aac6e49dc0cbd669044c4169987a9d8f66b6027" exitCode=0 Nov 25 17:09:01 crc kubenswrapper[4743]: I1125 17:09:01.455668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jt59v/must-gather-lzb6c" event={"ID":"2339ebd2-9b75-4640-b625-8ce96fbfc1e5","Type":"ContainerDied","Data":"bb131daf59033c4f22a31c934aac6e49dc0cbd669044c4169987a9d8f66b6027"} Nov 25 17:09:01 crc kubenswrapper[4743]: I1125 17:09:01.456759 4743 scope.go:117] "RemoveContainer" containerID="bb131daf59033c4f22a31c934aac6e49dc0cbd669044c4169987a9d8f66b6027" Nov 25 17:09:01 crc kubenswrapper[4743]: I1125 17:09:01.738129 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt59v_must-gather-lzb6c_2339ebd2-9b75-4640-b625-8ce96fbfc1e5/gather/0.log" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.200568 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jt59v/must-gather-lzb6c"] Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.201367 4743 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-jt59v/must-gather-lzb6c" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="copy" containerID="cri-o://303d2387ef7f2b3aee8d643b7e016001e50e5fe70377c00c59505c5e4b0170ca" gracePeriod=2 Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.210333 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jt59v/must-gather-lzb6c"] Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.549420 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt59v_must-gather-lzb6c_2339ebd2-9b75-4640-b625-8ce96fbfc1e5/copy/0.log" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.550386 4743 generic.go:334] "Generic (PLEG): container finished" podID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerID="303d2387ef7f2b3aee8d643b7e016001e50e5fe70377c00c59505c5e4b0170ca" exitCode=143 Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.550449 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229147b156a9da2b6980324974b28092bbc9120846bd6ff8691b8914747e2d3d" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.601493 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jt59v_must-gather-lzb6c_2339ebd2-9b75-4640-b625-8ce96fbfc1e5/copy/0.log" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.601943 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.630146 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output\") pod \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.630257 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdlhx\" (UniqueName: \"kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx\") pod \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\" (UID: \"2339ebd2-9b75-4640-b625-8ce96fbfc1e5\") " Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.636809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx" (OuterVolumeSpecName: "kube-api-access-pdlhx") pod "2339ebd2-9b75-4640-b625-8ce96fbfc1e5" (UID: "2339ebd2-9b75-4640-b625-8ce96fbfc1e5"). InnerVolumeSpecName "kube-api-access-pdlhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.731988 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdlhx\" (UniqueName: \"kubernetes.io/projected/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-kube-api-access-pdlhx\") on node \"crc\" DevicePath \"\"" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.773723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2339ebd2-9b75-4640-b625-8ce96fbfc1e5" (UID: "2339ebd2-9b75-4640-b625-8ce96fbfc1e5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.786439 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" path="/var/lib/kubelet/pods/2339ebd2-9b75-4640-b625-8ce96fbfc1e5/volumes" Nov 25 17:09:11 crc kubenswrapper[4743]: I1125 17:09:11.833567 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2339ebd2-9b75-4640-b625-8ce96fbfc1e5-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 17:09:12 crc kubenswrapper[4743]: I1125 17:09:12.557011 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jt59v/must-gather-lzb6c" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.598056 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:16 crc kubenswrapper[4743]: E1125 17:09:16.599644 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="registry-server" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.599666 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="registry-server" Nov 25 17:09:16 crc kubenswrapper[4743]: E1125 17:09:16.599682 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="gather" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.599707 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="gather" Nov 25 17:09:16 crc kubenswrapper[4743]: E1125 17:09:16.599732 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="extract-utilities" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.599741 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="extract-utilities" Nov 25 17:09:16 crc kubenswrapper[4743]: E1125 17:09:16.599771 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="copy" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.599779 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="copy" Nov 25 17:09:16 crc kubenswrapper[4743]: E1125 17:09:16.599794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="extract-content" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.599802 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="extract-content" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.600138 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="gather" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.600160 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2339ebd2-9b75-4640-b625-8ce96fbfc1e5" containerName="copy" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.600179 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b8c218-4dd6-4c3a-85ff-595df88a1bfb" containerName="registry-server" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.602172 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.614432 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.626126 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px87m\" (UniqueName: \"kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.626352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.626460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.727065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px87m\" (UniqueName: \"kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.727180 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.727250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.727672 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.727818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.754179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px87m\" (UniqueName: \"kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m\") pod \"community-operators-qxfn5\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:16 crc kubenswrapper[4743]: I1125 17:09:16.939464 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:17 crc kubenswrapper[4743]: I1125 17:09:17.477520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:18 crc kubenswrapper[4743]: I1125 17:09:18.626573 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerID="9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b" exitCode=0 Nov 25 17:09:18 crc kubenswrapper[4743]: I1125 17:09:18.626684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerDied","Data":"9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b"} Nov 25 17:09:18 crc kubenswrapper[4743]: I1125 17:09:18.626943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerStarted","Data":"f1488c8b1af6c0628e9debe850f3e13770b263b097a8ca519c1b142f4a9d8b00"} Nov 25 17:09:20 crc kubenswrapper[4743]: I1125 17:09:20.650241 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerID="e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4" exitCode=0 Nov 25 17:09:20 crc kubenswrapper[4743]: I1125 17:09:20.650324 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerDied","Data":"e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4"} Nov 25 17:09:21 crc kubenswrapper[4743]: I1125 17:09:21.661077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" 
event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerStarted","Data":"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138"} Nov 25 17:09:21 crc kubenswrapper[4743]: I1125 17:09:21.685208 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxfn5" podStartSLOduration=3.293584238 podStartE2EDuration="5.685188292s" podCreationTimestamp="2025-11-25 17:09:16 +0000 UTC" firstStartedPulling="2025-11-25 17:09:18.631376618 +0000 UTC m=+4237.753216167" lastFinishedPulling="2025-11-25 17:09:21.022980652 +0000 UTC m=+4240.144820221" observedRunningTime="2025-11-25 17:09:21.682390584 +0000 UTC m=+4240.804230133" watchObservedRunningTime="2025-11-25 17:09:21.685188292 +0000 UTC m=+4240.807027841" Nov 25 17:09:26 crc kubenswrapper[4743]: I1125 17:09:26.954227 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:26 crc kubenswrapper[4743]: I1125 17:09:26.955033 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:27 crc kubenswrapper[4743]: I1125 17:09:27.003677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:27 crc kubenswrapper[4743]: I1125 17:09:27.751683 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:27 crc kubenswrapper[4743]: I1125 17:09:27.796983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:29 crc kubenswrapper[4743]: I1125 17:09:29.726304 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qxfn5" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="registry-server" 
containerID="cri-o://77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138" gracePeriod=2 Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.628389 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.739315 4743 generic.go:334] "Generic (PLEG): container finished" podID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerID="77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138" exitCode=0 Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.739392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerDied","Data":"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138"} Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.739409 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxfn5" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.739514 4743 scope.go:117] "RemoveContainer" containerID="77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.739495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxfn5" event={"ID":"0b27dac0-5627-411c-93c7-6ec3cb155b6f","Type":"ContainerDied","Data":"f1488c8b1af6c0628e9debe850f3e13770b263b097a8ca519c1b142f4a9d8b00"} Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.758162 4743 scope.go:117] "RemoveContainer" containerID="e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.777901 4743 scope.go:117] "RemoveContainer" containerID="9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.787832 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities\") pod \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.787936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content\") pod \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.787988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px87m\" (UniqueName: \"kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m\") pod \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\" (UID: \"0b27dac0-5627-411c-93c7-6ec3cb155b6f\") " Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.788771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities" (OuterVolumeSpecName: "utilities") pod "0b27dac0-5627-411c-93c7-6ec3cb155b6f" (UID: "0b27dac0-5627-411c-93c7-6ec3cb155b6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.800820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m" (OuterVolumeSpecName: "kube-api-access-px87m") pod "0b27dac0-5627-411c-93c7-6ec3cb155b6f" (UID: "0b27dac0-5627-411c-93c7-6ec3cb155b6f"). InnerVolumeSpecName "kube-api-access-px87m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.845952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b27dac0-5627-411c-93c7-6ec3cb155b6f" (UID: "0b27dac0-5627-411c-93c7-6ec3cb155b6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.859314 4743 scope.go:117] "RemoveContainer" containerID="77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138" Nov 25 17:09:30 crc kubenswrapper[4743]: E1125 17:09:30.860085 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138\": container with ID starting with 77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138 not found: ID does not exist" containerID="77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.860141 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138"} err="failed to get container status \"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138\": rpc error: code = NotFound desc = could not find container \"77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138\": container with ID starting with 77beb3440d9b7825941c0409fda617180ae64c8305dcabe90dae0edefbc84138 not found: ID does not exist" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.860379 4743 scope.go:117] "RemoveContainer" containerID="e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4" Nov 25 17:09:30 crc kubenswrapper[4743]: E1125 17:09:30.860786 4743 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4\": container with ID starting with e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4 not found: ID does not exist" containerID="e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.860868 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4"} err="failed to get container status \"e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4\": rpc error: code = NotFound desc = could not find container \"e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4\": container with ID starting with e179d2aeeca8571b5ed6180ace9deb84b993a5ad528a674b724811f1e72345f4 not found: ID does not exist" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.860899 4743 scope.go:117] "RemoveContainer" containerID="9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b" Nov 25 17:09:30 crc kubenswrapper[4743]: E1125 17:09:30.861145 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b\": container with ID starting with 9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b not found: ID does not exist" containerID="9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.861187 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b"} err="failed to get container status \"9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b\": rpc error: code = NotFound desc = could 
not find container \"9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b\": container with ID starting with 9ba43e4cedc2a25c83967d082e29c4909472acd29ad5b00e5b26e3c4b212d51b not found: ID does not exist" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.890051 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.890088 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px87m\" (UniqueName: \"kubernetes.io/projected/0b27dac0-5627-411c-93c7-6ec3cb155b6f-kube-api-access-px87m\") on node \"crc\" DevicePath \"\"" Nov 25 17:09:30 crc kubenswrapper[4743]: I1125 17:09:30.890102 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b27dac0-5627-411c-93c7-6ec3cb155b6f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 17:09:31 crc kubenswrapper[4743]: I1125 17:09:31.077145 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:31 crc kubenswrapper[4743]: I1125 17:09:31.087292 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qxfn5"] Nov 25 17:09:31 crc kubenswrapper[4743]: I1125 17:09:31.808638 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" path="/var/lib/kubelet/pods/0b27dac0-5627-411c-93c7-6ec3cb155b6f/volumes" Nov 25 17:09:50 crc kubenswrapper[4743]: I1125 17:09:50.077577 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:09:50 crc 
kubenswrapper[4743]: I1125 17:09:50.078239 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:10:20 crc kubenswrapper[4743]: I1125 17:10:20.077710 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:10:20 crc kubenswrapper[4743]: I1125 17:10:20.078245 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.077178 4743 patch_prober.go:28] interesting pod/machine-config-daemon-f7q7f container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.077774 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.077817 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.078547 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3"} pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.078614 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" containerName="machine-config-daemon" containerID="cri-o://2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" gracePeriod=600 Nov 25 17:10:50 crc kubenswrapper[4743]: E1125 17:10:50.422796 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.478179 4743 generic.go:334] "Generic (PLEG): container finished" podID="73c29847-f70f-4ab1-9691-685966384446" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" exitCode=0 Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.478270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" event={"ID":"73c29847-f70f-4ab1-9691-685966384446","Type":"ContainerDied","Data":"2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3"} 
Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.478348 4743 scope.go:117] "RemoveContainer" containerID="df21c8ad78bde4aea979a77528322442108122601a0badf468fa0875586a9ad5" Nov 25 17:10:50 crc kubenswrapper[4743]: I1125 17:10:50.479306 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:10:50 crc kubenswrapper[4743]: E1125 17:10:50.479802 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:11:03 crc kubenswrapper[4743]: I1125 17:11:03.774653 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:11:03 crc kubenswrapper[4743]: E1125 17:11:03.775393 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:11:11 crc kubenswrapper[4743]: I1125 17:11:11.615294 4743 scope.go:117] "RemoveContainer" containerID="303d2387ef7f2b3aee8d643b7e016001e50e5fe70377c00c59505c5e4b0170ca" Nov 25 17:11:11 crc kubenswrapper[4743]: I1125 17:11:11.639082 4743 scope.go:117] "RemoveContainer" containerID="2ca770be6a947735ceec602d9229936267985ded6f56bbc476a5b324e28d2846" Nov 25 17:11:11 crc kubenswrapper[4743]: I1125 17:11:11.671470 4743 scope.go:117] "RemoveContainer" 
containerID="bb131daf59033c4f22a31c934aac6e49dc0cbd669044c4169987a9d8f66b6027" Nov 25 17:11:15 crc kubenswrapper[4743]: I1125 17:11:15.775294 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:11:15 crc kubenswrapper[4743]: E1125 17:11:15.776077 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:11:29 crc kubenswrapper[4743]: I1125 17:11:29.775807 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:11:29 crc kubenswrapper[4743]: E1125 17:11:29.776533 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:11:43 crc kubenswrapper[4743]: I1125 17:11:43.776666 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:11:43 crc kubenswrapper[4743]: E1125 17:11:43.778246 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:11:57 crc kubenswrapper[4743]: I1125 17:11:57.775495 4743 scope.go:117] "RemoveContainer" containerID="2bd47eba6c82e8e3ee490261147932f3c681a7af391c94a856790f2b409a85e3" Nov 25 17:11:57 crc kubenswrapper[4743]: E1125 17:11:57.776159 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f7q7f_openshift-machine-config-operator(73c29847-f70f-4ab1-9691-685966384446)\"" pod="openshift-machine-config-operator/machine-config-daemon-f7q7f" podUID="73c29847-f70f-4ab1-9691-685966384446" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.216511 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9qxj"] Nov 25 17:12:00 crc kubenswrapper[4743]: E1125 17:12:00.218041 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="extract-utilities" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.218065 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="extract-utilities" Nov 25 17:12:00 crc kubenswrapper[4743]: E1125 17:12:00.218107 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="extract-content" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.218116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="extract-content" Nov 25 17:12:00 crc kubenswrapper[4743]: E1125 17:12:00.218129 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="registry-server" Nov 25 17:12:00 crc kubenswrapper[4743]: 
I1125 17:12:00.218136 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="registry-server" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.218393 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b27dac0-5627-411c-93c7-6ec3cb155b6f" containerName="registry-server" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.219769 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.231136 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9qxj"] Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.308201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-utilities\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.308675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-catalog-content\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.308704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77czx\" (UniqueName: \"kubernetes.io/projected/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-kube-api-access-77czx\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 
17:12:00.412043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-catalog-content\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.412362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77czx\" (UniqueName: \"kubernetes.io/projected/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-kube-api-access-77czx\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.412795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-catalog-content\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.413189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-utilities\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.413469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-utilities\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.446706 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-77czx\" (UniqueName: \"kubernetes.io/projected/bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c-kube-api-access-77czx\") pod \"redhat-operators-c9qxj\" (UID: \"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c\") " pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:00 crc kubenswrapper[4743]: I1125 17:12:00.561737 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9qxj" Nov 25 17:12:01 crc kubenswrapper[4743]: I1125 17:12:01.012819 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9qxj"] Nov 25 17:12:01 crc kubenswrapper[4743]: I1125 17:12:01.127248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9qxj" event={"ID":"bcef0c7d-9f23-4ea9-a3a6-20e84cd9067c","Type":"ContainerStarted","Data":"7aed32f8c49ad4b1c47b5f6effc733fac5261d95de2df9f1a4376251e1db7bb3"}